Apr 16 15:11:30.852226 ip-10-0-135-252 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 15:11:31.279291 ip-10-0-135-252 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:11:31.279291 ip-10-0-135-252 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 15:11:31.279291 ip-10-0-135-252 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:11:31.279291 ip-10-0-135-252 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 15:11:31.279291 ip-10-0-135-252 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:11:31.280442 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.279941 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 15:11:31.283856 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283839 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:31.283856 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283856 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283860 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283863 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283867 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283869 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283872 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283875 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283878 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283880 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283883 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283886 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283888 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283891 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283894 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283896 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283899 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283901 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283904 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283907 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283909 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:31.283922 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283911 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283914 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283916 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283919 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283942 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283946 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283949 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283952 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283954 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283957 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283959 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283962 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283965 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283967 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283970 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283972 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283975 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283977 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283980 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283982 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:31.284411 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283986 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283988 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283991 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283994 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283997 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.283999 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284002 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284004 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284007 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284009 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284012 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284014 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284018 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284024 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284027 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284030 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284032 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284035 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284038 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284040 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:31.284893 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284044 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284046 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284049 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284058 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284061 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284064 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284067 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284069 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284072 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284074 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284077 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284080 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284082 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284086 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284088 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284093 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284096 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284099 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284102 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:31.285388 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284105 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284108 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284111 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284113 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284116 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.284119 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285675 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285683 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285686 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285690 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285693 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285696 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285698 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285701 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285704 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285707 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285709 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285712 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285715 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285718 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:31.285832 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285720 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285722 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285725 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285728 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285732 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285735 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285739 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285742 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285744 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285747 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285749 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285752 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285754 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285757 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285761 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285764 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285766 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285769 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:31.286346 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285772 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285775 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285778 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285781 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285784 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285786 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285789 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285791 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285794 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285796 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285799 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285802 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285804 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285807 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285809 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285812 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285814 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285817 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285821 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285824 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:31.286813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285828 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285832 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285835 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285838 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285840 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285843 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285846 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285849 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285860 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285864 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285867 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285870 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285872 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285875 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285877 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285880 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285883 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285885 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285888 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285890 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285893 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:31.287302 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285896 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285899 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285901 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285904 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285906 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285909 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285912 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285915 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285917 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285921 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285935 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285938 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.285941 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286008 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286015 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286028 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286033 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286038 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286041 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286045 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286051 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 15:11:31.287813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286054 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286057 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286061 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286064 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286067 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286069 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286072 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286075 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286078 2567 flags.go:64] FLAG: --cloud-config=""
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286081 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286083 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286088 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286091 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286094 2567 flags.go:64] FLAG: --config-dir=""
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286097 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286100 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286104 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286107 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286110 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286113 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286116 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286119 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286123 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286126 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286129 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 15:11:31.288345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286133 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286137 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286140 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286142 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286145 2567 flags.go:64] FLAG: --enable-server="true"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286149 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286153 2567 flags.go:64] FLAG: --event-burst="100"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286157 2567 flags.go:64] FLAG: --event-qps="50"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286159 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286162 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286165 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286169 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286173 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286176 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286179 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286182 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286185 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286188 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286191 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286194 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286197 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286199 2567 flags.go:64] FLAG: --feature-gates=""
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286203 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286206 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286210 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 15:11:31.288967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286213 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286216 2567 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286219 2567 flags.go:64] FLAG: --help="false"
Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286222 2567 flags.go:64] FLAG:
--hostname-override="ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286226 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286229 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286232 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286235 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286239 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286242 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286245 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286248 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286251 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286254 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286257 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286260 2567 flags.go:64] FLAG: --kube-reserved="" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286263 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 15:11:31.289651 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:11:31.286266 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286269 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286272 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286275 2567 flags.go:64] FLAG: --lock-file="" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286277 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286280 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286283 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 15:11:31.289651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286288 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286291 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286294 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286297 2567 flags.go:64] FLAG: --logging-format="text" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286300 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286303 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286306 2567 flags.go:64] FLAG: --manifest-url="" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286309 2567 flags.go:64] FLAG: 
--manifest-url-header="" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286313 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286316 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286320 2567 flags.go:64] FLAG: --max-pods="110" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286324 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286327 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286330 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286333 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286335 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286339 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286342 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286350 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286353 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286357 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286360 2567 flags.go:64] FLAG: --pod-cidr="" Apr 16 15:11:31.290278 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286363 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 15:11:31.290278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286369 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286372 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286375 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286378 2567 flags.go:64] FLAG: --port="10250" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286381 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286384 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00d463fa8bdff5d44" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286387 2567 flags.go:64] FLAG: --qos-reserved="" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286390 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286393 2567 flags.go:64] FLAG: --register-node="true" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286396 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286399 2567 flags.go:64] FLAG: --register-with-taints="" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286403 2567 flags.go:64] FLAG: --registry-burst="10" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286408 2567 flags.go:64] FLAG: --registry-qps="5" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: 
I0416 15:11:31.286411 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286414 2567 flags.go:64] FLAG: --reserved-memory="" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286418 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286421 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286424 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286427 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286429 2567 flags.go:64] FLAG: --runonce="false" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286432 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286435 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286438 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286441 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286444 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286447 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 15:11:31.290833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286450 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286453 2567 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286456 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286459 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286465 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286468 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286472 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286475 2567 flags.go:64] FLAG: --system-cgroups="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286478 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286483 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286486 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286489 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286493 2567 flags.go:64] FLAG: --tls-min-version="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286496 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286499 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286502 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286505 
2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286508 2567 flags.go:64] FLAG: --v="2" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286513 2567 flags.go:64] FLAG: --version="false" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286517 2567 flags.go:64] FLAG: --vmodule="" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286521 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286524 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286618 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286622 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 15:11:31.291498 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286625 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286628 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286632 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286634 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286637 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286640 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 15:11:31.292197 ip-10-0-135-252 
kubenswrapper[2567]: W0416 15:11:31.286642 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286645 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286648 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286650 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286653 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286656 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286660 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286663 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286665 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286668 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286670 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286673 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286675 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 
15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286678 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 15:11:31.292197 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286681 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286683 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286686 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286689 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286691 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286693 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286698 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286700 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286703 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286706 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286708 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 15:11:31.292738 ip-10-0-135-252 
kubenswrapper[2567]: W0416 15:11:31.286711 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286713 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286716 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286718 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286739 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286743 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286746 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286749 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 15:11:31.292738 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286752 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286755 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286758 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286760 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286764 2567 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286767 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286770 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286773 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286776 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286778 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286781 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286784 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286786 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286789 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286791 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286794 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286797 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286801 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286804 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286808 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 15:11:31.293225 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286811 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286814 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286816 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286819 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286822 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286825 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286828 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286831 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286833 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286836 
2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286838 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286841 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286844 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286846 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286849 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286852 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286854 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286859 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286861 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286864 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 15:11:31.293716 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286866 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286869 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 
15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286871 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286874 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.286878 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.286883 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.293320 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.293335 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293380 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293384 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293388 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293391 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293394 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293397 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293400 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:31.294235 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293402 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293405 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293408 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293410 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293413 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293415 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293418 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293421 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293423 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293426 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293428 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293431 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293433 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293436 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293440 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293443 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
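[Editor's note] The deprecation warnings at startup (for --container-runtime-endpoint, --system-reserved, and the KMSv1 gate above) all point at the kubelet config file. A minimal, hypothetical KubeletConfiguration sketch of where those settings would live; the socket path is an assumed CRI-O default, while the systemReserved values and KMSv1=true match what this log reports later:

```yaml
# Hypothetical KubeletConfiguration fragment, not this node's actual config.
# Maps the deprecated CLI flags warned about above onto config-file fields:
#   --container-runtime-endpoint -> containerRuntimeEndpoint
#   --system-reserved            -> systemReserved
#   KMSv1=true                   -> featureGates
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # assumed CRI-O socket path
featureGates:
  KMSv1: true
systemReserved:
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
```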
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293447 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293450 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293453 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:31.294619 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293456 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293459 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293461 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293465 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293467 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293471 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293474 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293477 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293479 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293482 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293485 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293487 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293490 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293493 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293495 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293498 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293500 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293502 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293505 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293507 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:31.295105 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293510 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293512 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293515 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293520 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293524 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293527 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293530 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293533 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293536 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293539 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293542 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293545 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293548 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293550 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293553 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293556 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293559 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293561 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293564 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293566 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:31.295600 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293569 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293571 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293574 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293577 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293580 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293582 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293585 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293587 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293590 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293593 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293595 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293598 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293601 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293604 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293606 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293609 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293611 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293614 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293616 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:31.296133 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293619 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.293624 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293718 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293722 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293725 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293728 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293732 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
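[Editor's note] The feature_gate.go:384 entries above flatten the effective gate set into a single Go-style `map[Name:bool ...]` string. A small sketch of pulling that map back out of a journal line when triaging logs like these; `parse_feature_gates` is a hypothetical helper, not part of the kubelet:

```python
import re


def parse_feature_gates(entry: str) -> dict[str, bool]:
    """Extract the Go-style map from a 'feature gates: {map[...]}' log entry."""
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", entry)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")  # e.g. "KMSv1:true"
        gates[name] = value == "true"
    return gates


entry = ('I0416 15:11:31.293624 2567 feature_gate.go:384] '
         'feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}')
print(parse_feature_gates(entry))  # {'ImageVolume': True, 'KMSv1': True, 'NodeSwap': False}
```

Entries the parser does not recognize simply yield an empty dict, so it can be mapped over a whole `journalctl -u kubelet` stream and the non-matching lines ignored.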
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293736 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293739 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293741 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293744 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293746 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293749 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293752 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293755 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293757 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:31.296591 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293760 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293762 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293765 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293767 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293770 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293773 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293776 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293779 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293782 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293786 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293789 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293792 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293795 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293798 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293801 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293804 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293806 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293809 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293812 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:31.297055 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293820 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293823 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293825 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293828 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293831 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293834 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293836 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293839 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293841 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293844 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293847 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293849 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293860 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293864 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293867 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293869 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293872 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293874 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293877 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293879 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:31.297531 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293882 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293884 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293887 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293890 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293892 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293895 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293898 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293900 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293903 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293905 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293908 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293911 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293913 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293922 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293939 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293942 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293945 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293948 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293950 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293952 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:31.298101 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293955 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293958 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293960 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293963 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293965 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293968 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293971 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293973 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293976 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293978 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293980 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293983 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:31.293986 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.293992 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.294680 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 15:11:31.298665 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.296600 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 15:11:31.299135 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.297547 2567 server.go:1019] "Starting client certificate rotation"
Apr 16 15:11:31.299135 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.297654 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 15:11:31.299135 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.298451 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 15:11:31.322173 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.322153 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 15:11:31.328110 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.328087 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 15:11:31.349616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.349593 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 16 15:11:31.354774 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.354759 2567 log.go:25] "Validated CRI v1 image API"
Apr 16 15:11:31.356085 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.356068 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 15:11:31.358909 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.358886 2567 fs.go:135] Filesystem UUIDs: map[18a47455-46d2-4031-899a-d7535e9d5831:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ae09a7ce-9136-4817-bf38-f895b43f6ed2:/dev/nvme0n1p4]
Apr 16 15:11:31.358999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.358908 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 15:11:31.364732 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.364610 2567 manager.go:217] Machine: {Timestamp:2026-04-16 15:11:31.362649214 +0000 UTC m=+0.392220583 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099894 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2485377341232d5c8367dfbc7baa01 SystemUUID:ec248537-7341-232d-5c83-67dfbc7baa01 BootID:1bbba95b-4033-43c6-a390-14698140b964 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e4:d3:50:76:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e4:d3:50:76:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:63:67:73:97:4d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 15:11:31.365569 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.365558 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 15:11:31.365662 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.365650 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 15:11:31.366734 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.366717 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 15:11:31.368155 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.368132 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 15:11:31.368307 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.368158 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-252.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 15:11:31.368350 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.368317 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 15:11:31.368350 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.368325 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 15:11:31.368350 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.368338 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 15:11:31.369138 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.369128 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 15:11:31.370499 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.370489 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 15:11:31.370606 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.370597 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 15:11:31.372954 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.372944 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 15:11:31.373018 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.372962 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 15:11:31.373018 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.372975 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 15:11:31.373018 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.372985 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 16 15:11:31.373018 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.372994 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 15:11:31.374142 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.374131 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 15:11:31.374185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.374148 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 15:11:31.376958 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.376939 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 15:11:31.378982 ip-10-0-135-252
kubenswrapper[2567]: I0416 15:11:31.378968 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 15:11:31.380667 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380656 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380673 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380684 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380690 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380695 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380702 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380708 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380714 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380721 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380728 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 15:11:31.380736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380740 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
15:11:31.381002 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.380751 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 15:11:31.381540 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.381526 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 15:11:31.381540 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.381537 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 15:11:31.384889 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.384875 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 15:11:31.384974 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.384912 2567 server.go:1295] "Started kubelet" Apr 16 15:11:31.385062 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.385016 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 15:11:31.385095 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.385053 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 15:11:31.385123 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.385113 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 15:11:31.385861 ip-10-0-135-252 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 15:11:31.386321 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.386164 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 15:11:31.387581 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.387562 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 16 15:11:31.392048 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.392018 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-252.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 15:11:31.392145 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.392110 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-252.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 15:11:31.392145 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.392094 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 15:11:31.394535 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.394515 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 15:11:31.394650 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.394632 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 15:11:31.395482 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.392047 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-252.ec2.internal.18a6deff5bf28b5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-252.ec2.internal,UID:ip-10-0-135-252.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-252.ec2.internal,},FirstTimestamp:2026-04-16 15:11:31.384888154 +0000 UTC m=+0.414459522,LastTimestamp:2026-04-16 15:11:31.384888154 +0000 UTC m=+0.414459522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-252.ec2.internal,}" Apr 16 15:11:31.395726 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.395668 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 15:11:31.395726 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.395658 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 15:11:31.395870 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.395747 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 15:11:31.395870 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.395839 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 16 15:11:31.396677 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.396652 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found" Apr 16 15:11:31.396677 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.395851 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 16 15:11:31.396817 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.396737 2567 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 15:11:31.397107 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397090 2567 factory.go:153] Registering CRI-O factory Apr 16 15:11:31.397173 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397110 2567 factory.go:223] Registration of the crio container factory successfully Apr 16 15:11:31.397173 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397171 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 15:11:31.397240 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397180 2567 factory.go:55] Registering systemd factory Apr 16 15:11:31.397240 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397188 2567 factory.go:223] Registration of the systemd container factory successfully Apr 16 15:11:31.397240 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397207 2567 factory.go:103] Registering Raw factory Apr 16 15:11:31.397240 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.397222 2567 manager.go:1196] Started watching for new ooms in manager Apr 16 15:11:31.398496 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.398481 2567 manager.go:319] Starting recovery of all containers Apr 16 15:11:31.408416 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.408290 2567 manager.go:324] Recovery completed Apr 16 15:11:31.410578 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.410546 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-252.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 15:11:31.410704 ip-10-0-135-252 kubenswrapper[2567]: 
E0416 15:11:31.410685 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 15:11:31.412517 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.412500 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:31.414536 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.414517 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:31.414601 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.414548 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:31.414601 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.414562 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:31.415051 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.415036 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 15:11:31.415106 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.415052 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 15:11:31.415106 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.415091 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 15:11:31.417248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.417235 2567 policy_none.go:49] "None policy: Start" Apr 16 15:11:31.417326 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.417253 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 15:11:31.417326 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.417264 2567 state_mem.go:35] 
"Initializing new in-memory state store" Apr 16 15:11:31.418105 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.418028 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-252.ec2.internal.18a6deff5db6ef7e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-252.ec2.internal,UID:ip-10-0-135-252.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-252.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-252.ec2.internal,},FirstTimestamp:2026-04-16 15:11:31.414536062 +0000 UTC m=+0.444107438,LastTimestamp:2026-04-16 15:11:31.414536062 +0000 UTC m=+0.444107438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-252.ec2.internal,}" Apr 16 15:11:31.432366 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.432303 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-252.ec2.internal.18a6deff5db73c6e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-252.ec2.internal,UID:ip-10-0-135-252.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-252.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-252.ec2.internal,},FirstTimestamp:2026-04-16 15:11:31.414555758 +0000 UTC m=+0.444127130,LastTimestamp:2026-04-16 15:11:31.414555758 +0000 UTC 
m=+0.444127130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-252.ec2.internal,}" Apr 16 15:11:31.448537 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.448524 2567 manager.go:341] "Starting Device Plugin manager" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.448558 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.448568 2567 server.go:85] "Starting device plugin registration server" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.448805 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.448824 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.448971 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.449040 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.449049 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.449477 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.449512 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-252.ec2.internal\" not found" Apr 16 15:11:31.461952 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.457721 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-252.ec2.internal.18a6deff5db76627 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-252.ec2.internal,UID:ip-10-0-135-252.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-135-252.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-135-252.ec2.internal,},FirstTimestamp:2026-04-16 15:11:31.414566439 +0000 UTC m=+0.444137810,LastTimestamp:2026-04-16 15:11:31.414566439 +0000 UTC m=+0.444137810,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-252.ec2.internal,}" Apr 16 15:11:31.468472 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.468412 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-252.ec2.internal.18a6deff5fdc1803 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-252.ec2.internal,UID:ip-10-0-135-252.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:ip-10-0-135-252.ec2.internal,},FirstTimestamp:2026-04-16 15:11:31.450525699 +0000 UTC m=+0.480097058,LastTimestamp:2026-04-16 15:11:31.450525699 +0000 UTC m=+0.480097058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-252.ec2.internal,}" Apr 16 15:11:31.501838 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.501819 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sdxr5" Apr 16 15:11:31.519487 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.519462 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sdxr5" Apr 16 15:11:31.527802 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.527778 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 15:11:31.528959 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.528942 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 15:11:31.529033 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.528970 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 15:11:31.529033 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.528987 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 15:11:31.529033 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.528993 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 15:11:31.529033 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.529023 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 15:11:31.550209 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.550158 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:31.550490 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.550474 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:31.550887 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.550871 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:31.550974 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.550901 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:31.550974 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.550911 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:31.550974 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.550947 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.566707 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.566689 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.566707 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.566706 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-252.ec2.internal\": node \"ip-10-0-135-252.ec2.internal\" not found" Apr 16 
15:11:31.622467 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.622442 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found" Apr 16 15:11:31.629528 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.629508 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal"] Apr 16 15:11:31.629591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.629582 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:31.631121 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.631094 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:31.631121 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.631120 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:31.631252 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.631131 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:31.632315 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632303 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:31.632471 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632457 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.632517 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632486 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:31.632978 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632961 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:31.633071 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632989 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:31.633071 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632999 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:31.633071 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.632961 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:31.633071 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.633066 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:31.633219 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.633077 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:31.634584 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.634569 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.634664 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.634600 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:31.635265 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.635249 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:31.635321 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.635279 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:31.635321 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.635294 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:31.658574 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.658552 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-252.ec2.internal\" not found" node="ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.662780 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.662764 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-252.ec2.internal\" not found" node="ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.698311 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.698290 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9504c7dace0e1dd78455cb89197ac884-config\") pod \"kube-apiserver-proxy-ip-10-0-135-252.ec2.internal\" (UID: \"9504c7dace0e1dd78455cb89197ac884\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal" Apr 16 15:11:31.698376 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:11:31.698316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6d557d9a02308f56d7757f97df43f77-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal\" (UID: \"e6d557d9a02308f56d7757f97df43f77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.698376 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.698341 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6d557d9a02308f56d7757f97df43f77-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal\" (UID: \"e6d557d9a02308f56d7757f97df43f77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.723156 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.723133 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:31.798986 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.798955 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6d557d9a02308f56d7757f97df43f77-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal\" (UID: \"e6d557d9a02308f56d7757f97df43f77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.799103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.798989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6d557d9a02308f56d7757f97df43f77-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal\" (UID: \"e6d557d9a02308f56d7757f97df43f77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.799103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.799008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9504c7dace0e1dd78455cb89197ac884-config\") pod \"kube-apiserver-proxy-ip-10-0-135-252.ec2.internal\" (UID: \"9504c7dace0e1dd78455cb89197ac884\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.799103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.799056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6d557d9a02308f56d7757f97df43f77-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal\" (UID: \"e6d557d9a02308f56d7757f97df43f77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.799103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.799056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6d557d9a02308f56d7757f97df43f77-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal\" (UID: \"e6d557d9a02308f56d7757f97df43f77\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.799103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.799085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9504c7dace0e1dd78455cb89197ac884-config\") pod \"kube-apiserver-proxy-ip-10-0-135-252.ec2.internal\" (UID: \"9504c7dace0e1dd78455cb89197ac884\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.824165 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.824114 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:31.924596 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:31.924568 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:31.962890 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.962870 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:31.965031 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:31.965011 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:32.024951 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.024916 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.126053 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.125982 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.226472 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.226439 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.297728 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.297697 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 15:11:32.298307 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.297844 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 15:11:32.327131 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.327111 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.395316 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.395243 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 15:11:32.427255 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.427229 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.429123 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.429106 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 15:11:32.489669 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.489643 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 15:11:32.521682 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.521639 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:06:31 +0000 UTC" deadline="2028-02-01 10:19:26.637842872 +0000 UTC"
Apr 16 15:11:32.521682 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.521678 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15739h7m54.11616911s"
Apr 16 15:11:32.527808 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.527785 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.534491 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:32.534467 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d557d9a02308f56d7757f97df43f77.slice/crio-1fc53bc998d966a35ed646b8ccea1e9a6026cbb253f795b69d59a2fc7b751d7f WatchSource:0}: Error finding container 1fc53bc998d966a35ed646b8ccea1e9a6026cbb253f795b69d59a2fc7b751d7f: Status 404 returned error can't find the container with id 1fc53bc998d966a35ed646b8ccea1e9a6026cbb253f795b69d59a2fc7b751d7f
Apr 16 15:11:32.535227 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:32.535210 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9504c7dace0e1dd78455cb89197ac884.slice/crio-bccfd0fe4d53691eec16aabbcb28842366e6098d3f8f2f3581c913539dbb987d WatchSource:0}: Error finding container bccfd0fe4d53691eec16aabbcb28842366e6098d3f8f2f3581c913539dbb987d: Status 404 returned error can't find the container with id bccfd0fe4d53691eec16aabbcb28842366e6098d3f8f2f3581c913539dbb987d
Apr 16 15:11:32.539606 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.539591 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:11:32.556264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.556245 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ll6mc"
Apr 16 15:11:32.566014 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.565995 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ll6mc"
Apr 16 15:11:32.628008 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:32.627983 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-252.ec2.internal\" not found"
Apr 16 15:11:32.672768 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.672681 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 15:11:32.695566 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.695543 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:32.712995 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.712972 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 15:11:32.717371 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.717356 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 15:11:32.718302 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.718290 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal"
Apr 16 15:11:32.736486 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:32.736468 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 15:11:33.374358 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.374288 2567 apiserver.go:52] "Watching apiserver"
Apr 16 15:11:33.382921 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.382756 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 15:11:33.384914 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.384883 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4","openshift-cluster-node-tuning-operator/tuned-fw4z2","openshift-dns/node-resolver-j9vwn","openshift-image-registry/node-ca-h97tg","openshift-multus/network-metrics-daemon-x8njb","openshift-network-diagnostics/network-check-target-29h6w","openshift-ovn-kubernetes/ovnkube-node-9jzvn","kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal","openshift-multus/multus-additional-cni-plugins-7kth7","openshift-multus/multus-nll6d","openshift-network-operator/iptables-alerter-pgdhn","kube-system/konnectivity-agent-kn9zn"]
Apr 16 15:11:33.387972 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.387950 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.389325 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.389303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h97tg"
Apr 16 15:11:33.389975 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.389954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.393700 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.393631 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:11:33.393804 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.393781 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290"
Apr 16 15:11:33.395566 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.395547 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:11:33.395659 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.395612 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b"
Apr 16 15:11:33.396864 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.396843 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.398248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.398211 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.398983 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.398964 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-756ks\""
Apr 16 15:11:33.399163 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.399141 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 15:11:33.399233 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.399175 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 15:11:33.399948 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.399635 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pgdhn"
Apr 16 15:11:33.399948 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.399772 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kn9zn"
Apr 16 15:11:33.400569 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.400550 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 15:11:33.400865 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.400844 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 15:11:33.402996 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.402965 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 15:11:33.404148 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.403399 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 15:11:33.404148 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.403543 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:11:33.404148 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.403601 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j9vwn"
Apr 16 15:11:33.404148 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.404046 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 15:11:33.404962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.404603 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 15:11:33.404962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.404785 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 15:11:33.405079 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.405009 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 15:11:33.407200 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-systemd\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407200 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-lib-modules\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407200 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2da4fddf-5318-4d67-9672-73870158cdf2-serviceca\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg"
Apr 16 15:11:33.407200 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-socket-dir-parent\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-924j8\" (UniqueName: \"kubernetes.io/projected/da21a633-b9e5-4a37-b9bc-12d29b6b666b-kube-api-access-924j8\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysctl-d\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407314 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-cni-bin\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-registration-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407369 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-system-cni-dir\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407393 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.407436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-modprobe-d\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407441 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chdmr\" (UniqueName: \"kubernetes.io/projected/2da4fddf-5318-4d67-9672-73870158cdf2-kube-api-access-chdmr\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407468 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-os-release\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407494 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-cni-multus\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-conf-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407546 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-kubernetes\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysconfig\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-device-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysctl-conf\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-cni-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-etc-kubernetes\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407716 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-tuned\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407737 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da4fddf-5318-4d67-9672-73870158cdf2-host\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg"
Apr 16 15:11:33.407760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2df6\" (UniqueName: \"kubernetes.io/projected/ba267359-2c95-4792-991e-a2e9eae5b290-kube-api-access-d2df6\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407792 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407840 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-socket-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfs9\" (UniqueName: \"kubernetes.io/projected/073a04d8-8eac-4abf-9567-c84c5466b74d-kube-api-access-vbfs9\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-multus-certs\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-var-lib-kubelet\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-sys-fs\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.407977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-iptables-alerter-script\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408001 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-sys\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408029 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-host\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da21a633-b9e5-4a37-b9bc-12d29b6b666b-cni-binary-copy\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408080 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-netns\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408112 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-tmp\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408138 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-system-cni-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408162 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-kubelet\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408184 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-hostroot\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408207 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-daemon-config\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.408346 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-host-slash\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stfm\" (UniqueName: \"kubernetes.io/projected/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-kube-api-access-6stfm\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-cnibin\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-run\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrck9\" (UniqueName: \"kubernetes.io/projected/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-kube-api-access-zrck9\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408370 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-cnibin\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408423 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-os-release\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmsq\" (UniqueName: \"kubernetes.io/projected/c25330d1-d516-4851-8786-0d9a8e235f7d-kube-api-access-rwmsq\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408474 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-etc-selinux\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408502 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408556 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7"
Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-k8s-cni-cncf-io\") pod \"multus-nll6d\" (UID:
\"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.409208 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.408668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.413393 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413262 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kclpt\"" Apr 16 15:11:33.413474 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413455 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 15:11:33.413518 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413469 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:11:33.413518 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413477 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 15:11:33.413518 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413504 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 15:11:33.413880 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413725 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 15:11:33.413880 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413750 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 15:11:33.413880 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.413768 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 15:11:33.414070 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.414024 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 15:11:33.414221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.414204 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 15:11:33.414680 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.414436 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wjgzc\"" Apr 16 15:11:33.414680 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.414605 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 15:11:33.414810 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.414779 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 15:11:33.414995 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.414973 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lzhhw\"" Apr 16 15:11:33.415193 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.415177 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qw9hk\"" Apr 16 15:11:33.415433 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.415405 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8824q\"" Apr 16 15:11:33.415611 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.415593 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 15:11:33.415862 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.415836 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dn9rz\"" Apr 16 15:11:33.416112 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.416093 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 15:11:33.416292 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.416276 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 15:11:33.416432 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.416411 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 15:11:33.424236 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.424216 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5pcmx\"" Apr 16 15:11:33.424951 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.424921 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 15:11:33.424951 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.424946 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:33.425261 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.425210 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 15:11:33.425416 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.425401 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-stlgx\"" Apr 16 15:11:33.497070 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.497043 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 15:11:33.509239 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.509352 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-device-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.509352 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-ovnkube-config\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.509352 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysctl-conf\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.509352 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-cni-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.509352 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509348 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-etc-kubernetes\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509373 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-var-lib-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509400 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-run-ovn-kubernetes\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-env-overrides\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509447 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-ovnkube-script-lib\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-tuned\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509497 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da4fddf-5318-4d67-9672-73870158cdf2-host\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509524 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d2df6\" (UniqueName: \"kubernetes.io/projected/ba267359-2c95-4792-991e-a2e9eae5b290-kube-api-access-d2df6\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509568 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-device-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-socket-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfs9\" (UniqueName: \"kubernetes.io/projected/073a04d8-8eac-4abf-9567-c84c5466b74d-kube-api-access-vbfs9\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.509612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-etc-kubernetes\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509523 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysctl-conf\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da4fddf-5318-4d67-9672-73870158cdf2-host\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-cni-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-multus-certs\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510220 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509805 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-socket-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509861 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-ovn\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509899 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-multus-certs\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-var-lib-kubelet\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.509981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-sys-fs\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510013 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-var-lib-kubelet\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-iptables-alerter-script\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-sys-fs\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.510220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-kubelet\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-systemd\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510266 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-cni-bin\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d93e980a-c222-444d-a15d-49cf63ac1c76-ovn-node-metrics-cert\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-sys\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-host\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da21a633-b9e5-4a37-b9bc-12d29b6b666b-cni-binary-copy\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-netns\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510453 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-etc-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-tmp\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-system-cni-dir\") pod \"multus-nll6d\" (UID: 
\"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510539 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-kubelet\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-hostroot\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-daemon-config\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-host-slash\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6stfm\" (UniqueName: \"kubernetes.io/projected/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-kube-api-access-6stfm\") pod \"iptables-alerter-pgdhn\" (UID: 
\"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-cnibin\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.510800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-run\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510726 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-iptables-alerter-script\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrck9\" (UniqueName: \"kubernetes.io/projected/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-kube-api-access-zrck9\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510770 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-cnibin\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-os-release\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmsq\" (UniqueName: \"kubernetes.io/projected/c25330d1-d516-4851-8786-0d9a8e235f7d-kube-api-access-rwmsq\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510856 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/27ceed8c-3179-48b7-9f1d-9d9a245ded1e-agent-certs\") pod \"konnectivity-agent-kn9zn\" (UID: \"27ceed8c-3179-48b7-9f1d-9d9a245ded1e\") " pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510884 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-etc-selinux\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510896 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da21a633-b9e5-4a37-b9bc-12d29b6b666b-cni-binary-copy\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.510911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511057 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-sys\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511101 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-host\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511130 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-node-log\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-k8s-cni-cncf-io\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.511631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69wrm\" (UniqueName: \"kubernetes.io/projected/adf454f4-3a18-4824-b7ac-7736800ea721-kube-api-access-69wrm\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " 
pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-slash\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-cni-netd\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-systemd\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511325 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-lib-modules\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2da4fddf-5318-4d67-9672-73870158cdf2-serviceca\") pod \"node-ca-h97tg\" (UID: 
\"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511375 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-socket-dir-parent\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.511391 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511402 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-924j8\" (UniqueName: \"kubernetes.io/projected/da21a633-b9e5-4a37-b9bc-12d29b6b666b-kube-api-access-924j8\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-systemd-units\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-cnibin\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.511464 2567 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:34.011443627 +0000 UTC m=+3.041015006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-hostroot\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511488 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysctl-d\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-cni-bin\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511578 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adf454f4-3a18-4824-b7ac-7736800ea721-tmp-dir\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.512430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511577 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-host-slash\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-run-netns\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-os-release\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-cnibin\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511755 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-run\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511808 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysctl-d\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-system-cni-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-cni-bin\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-kubelet\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.511906 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-netns\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512153 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-daemon-config\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512701 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-systemd\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512795 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-lib-modules\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512830 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-registration-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-system-cni-dir\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.513212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.513999 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:11:33.512916 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/27ceed8c-3179-48b7-9f1d-9d9a245ded1e-konnectivity-ca\") pod \"konnectivity-agent-kn9zn\" (UID: \"27ceed8c-3179-48b7-9f1d-9d9a245ded1e\") " pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512963 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adf454f4-3a18-4824-b7ac-7736800ea721-hosts-file\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.512991 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-modprobe-d\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chdmr\" (UniqueName: \"kubernetes.io/projected/2da4fddf-5318-4d67-9672-73870158cdf2-kube-api-access-chdmr\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513046 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-os-release\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513999 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513075 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-cni-multus\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513101 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-conf-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513129 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-log-socket\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2da4fddf-5318-4d67-9672-73870158cdf2-serviceca\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7475\" (UniqueName: \"kubernetes.io/projected/d93e980a-c222-444d-a15d-49cf63ac1c76-kube-api-access-r7475\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 
15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513281 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-socket-dir-parent\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-run-k8s-cni-cncf-io\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-etc-selinux\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513423 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-kubernetes\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysconfig\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-sysconfig\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.513999 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513627 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-multus-conf-dir\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513740 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/073a04d8-8eac-4abf-9567-c84c5466b74d-registration-dir\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-os-release\") pod \"multus-nll6d\" (UID: 
\"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.513983 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-tuned\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.514016 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/da21a633-b9e5-4a37-b9bc-12d29b6b666b-host-var-lib-cni-multus\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.514051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-system-cni-dir\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.514299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c25330d1-d516-4851-8786-0d9a8e235f7d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.514315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-kubernetes\") pod \"tuned-fw4z2\" 
(UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.514384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-etc-modprobe-d\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.514824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.514644 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c25330d1-d516-4851-8786-0d9a8e235f7d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.516329 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.516308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-tmp\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.530234 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.530210 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:33.530339 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.530238 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:33.530339 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.530250 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod 
openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:33.530339 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:33.530320 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. No retries permitted until 2026-04-16 15:11:34.030301791 +0000 UTC m=+3.059873167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:33.530960 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.530786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2df6\" (UniqueName: \"kubernetes.io/projected/ba267359-2c95-4792-991e-a2e9eae5b290-kube-api-access-d2df6\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:33.532011 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.531989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stfm\" (UniqueName: \"kubernetes.io/projected/2e86b66b-dc3f-46f0-b201-2e891f1d30a9-kube-api-access-6stfm\") pod \"iptables-alerter-pgdhn\" (UID: \"2e86b66b-dc3f-46f0-b201-2e891f1d30a9\") " pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.532101 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.532012 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rwmsq\" (UniqueName: \"kubernetes.io/projected/c25330d1-d516-4851-8786-0d9a8e235f7d-kube-api-access-rwmsq\") pod \"multus-additional-cni-plugins-7kth7\" (UID: \"c25330d1-d516-4851-8786-0d9a8e235f7d\") " pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.532425 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.532401 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfs9\" (UniqueName: \"kubernetes.io/projected/073a04d8-8eac-4abf-9567-c84c5466b74d-kube-api-access-vbfs9\") pod \"aws-ebs-csi-driver-node-vl7l4\" (UID: \"073a04d8-8eac-4abf-9567-c84c5466b74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.534719 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.534678 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal" event={"ID":"9504c7dace0e1dd78455cb89197ac884","Type":"ContainerStarted","Data":"bccfd0fe4d53691eec16aabbcb28842366e6098d3f8f2f3581c913539dbb987d"} Apr 16 15:11:33.535951 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.535913 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal" event={"ID":"e6d557d9a02308f56d7757f97df43f77","Type":"ContainerStarted","Data":"1fc53bc998d966a35ed646b8ccea1e9a6026cbb253f795b69d59a2fc7b751d7f"} Apr 16 15:11:33.540738 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.540689 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-924j8\" (UniqueName: \"kubernetes.io/projected/da21a633-b9e5-4a37-b9bc-12d29b6b666b-kube-api-access-924j8\") pod \"multus-nll6d\" (UID: \"da21a633-b9e5-4a37-b9bc-12d29b6b666b\") " pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.541069 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.541043 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-zrck9\" (UniqueName: \"kubernetes.io/projected/f98e480a-3afa-49c5-a57d-cae4dbf3ffb6-kube-api-access-zrck9\") pod \"tuned-fw4z2\" (UID: \"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6\") " pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.541155 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.541134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chdmr\" (UniqueName: \"kubernetes.io/projected/2da4fddf-5318-4d67-9672-73870158cdf2-kube-api-access-chdmr\") pod \"node-ca-h97tg\" (UID: \"2da4fddf-5318-4d67-9672-73870158cdf2\") " pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.566738 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.566702 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:06:32 +0000 UTC" deadline="2027-12-06 19:25:58.457791523 +0000 UTC" Apr 16 15:11:33.566738 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.566733 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14380h14m24.891061678s" Apr 16 15:11:33.613915 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.613881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614037 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.613939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adf454f4-3a18-4824-b7ac-7736800ea721-tmp-dir\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.614037 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.613990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614037 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-run-netns\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614059 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/27ceed8c-3179-48b7-9f1d-9d9a245ded1e-konnectivity-ca\") pod \"konnectivity-agent-kn9zn\" (UID: \"27ceed8c-3179-48b7-9f1d-9d9a245ded1e\") " pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.614185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adf454f4-3a18-4824-b7ac-7736800ea721-hosts-file\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " 
pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.614185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-log-socket\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7475\" (UniqueName: \"kubernetes.io/projected/d93e980a-c222-444d-a15d-49cf63ac1c76-kube-api-access-r7475\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-ovnkube-config\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-run-netns\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adf454f4-3a18-4824-b7ac-7736800ea721-tmp-dir\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " 
pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-var-lib-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-run-ovn-kubernetes\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614241 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-log-socket\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-env-overrides\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adf454f4-3a18-4824-b7ac-7736800ea721-hosts-file\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " 
pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-ovnkube-script-lib\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614145 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-ovn\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614331 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-run-ovn-kubernetes\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-kubelet\") pod \"ovnkube-node-9jzvn\" 
(UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614377 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-var-lib-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-systemd\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614418 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-systemd\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-cni-bin\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.614495 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614453 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d93e980a-c222-444d-a15d-49cf63ac1c76-ovn-node-metrics-cert\") pod 
\"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-cni-bin\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-run-ovn\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-etc-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-etc-openvswitch\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/27ceed8c-3179-48b7-9f1d-9d9a245ded1e-agent-certs\") pod \"konnectivity-agent-kn9zn\" 
(UID: \"27ceed8c-3179-48b7-9f1d-9d9a245ded1e\") " pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614456 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-kubelet\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/27ceed8c-3179-48b7-9f1d-9d9a245ded1e-konnectivity-ca\") pod \"konnectivity-agent-kn9zn\" (UID: \"27ceed8c-3179-48b7-9f1d-9d9a245ded1e\") " pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614635 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-node-log\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614667 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-node-log\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69wrm\" (UniqueName: \"kubernetes.io/projected/adf454f4-3a18-4824-b7ac-7736800ea721-kube-api-access-69wrm\") pod \"node-resolver-j9vwn\" (UID: 
\"adf454f4-3a18-4824-b7ac-7736800ea721\") " pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-slash\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614762 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-cni-netd\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-systemd-units\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614796 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-ovnkube-config\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-env-overrides\") pod \"ovnkube-node-9jzvn\" (UID: 
\"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-cni-netd\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614859 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-host-slash\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.615264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.614992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d93e980a-c222-444d-a15d-49cf63ac1c76-systemd-units\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.616145 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.615352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d93e980a-c222-444d-a15d-49cf63ac1c76-ovnkube-script-lib\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.617215 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.617196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/27ceed8c-3179-48b7-9f1d-9d9a245ded1e-agent-certs\") pod \"konnectivity-agent-kn9zn\" (UID: 
\"27ceed8c-3179-48b7-9f1d-9d9a245ded1e\") " pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.617388 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.617373 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d93e980a-c222-444d-a15d-49cf63ac1c76-ovn-node-metrics-cert\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.635662 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.635607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7475\" (UniqueName: \"kubernetes.io/projected/d93e980a-c222-444d-a15d-49cf63ac1c76-kube-api-access-r7475\") pod \"ovnkube-node-9jzvn\" (UID: \"d93e980a-c222-444d-a15d-49cf63ac1c76\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:33.638379 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.638360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69wrm\" (UniqueName: \"kubernetes.io/projected/adf454f4-3a18-4824-b7ac-7736800ea721-kube-api-access-69wrm\") pod \"node-resolver-j9vwn\" (UID: \"adf454f4-3a18-4824-b7ac-7736800ea721\") " pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.701769 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.701742 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7kth7" Apr 16 15:11:33.710810 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.710783 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h97tg" Apr 16 15:11:33.722606 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.722587 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" Apr 16 15:11:33.729621 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.729603 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" Apr 16 15:11:33.736252 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.736233 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nll6d" Apr 16 15:11:33.745864 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.745840 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pgdhn" Apr 16 15:11:33.752587 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.752569 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:33.761101 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.761082 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j9vwn" Apr 16 15:11:33.768151 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:33.768130 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" Apr 16 15:11:34.017610 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.017529 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:34.017730 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:34.017682 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:34.017786 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:34.017770 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:35.0177512 +0000 UTC m=+4.047322561 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:34.118417 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.118382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:34.118594 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:34.118572 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:34.118641 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:34.118598 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:34.118641 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:34.118609 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:34.118715 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:34.118666 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. 
No retries permitted until 2026-04-16 15:11:35.118649185 +0000 UTC m=+4.148220547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:34.207002 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.206972 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda21a633_b9e5_4a37_b9bc_12d29b6b666b.slice/crio-045dd7b58713f26df3cf0c1db62ae07a08332f557e63b753f807900b88a77023 WatchSource:0}: Error finding container 045dd7b58713f26df3cf0c1db62ae07a08332f557e63b753f807900b88a77023: Status 404 returned error can't find the container with id 045dd7b58713f26df3cf0c1db62ae07a08332f557e63b753f807900b88a77023 Apr 16 15:11:34.213909 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.213881 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98e480a_3afa_49c5_a57d_cae4dbf3ffb6.slice/crio-1d24bb03d8b99706065404d2b0c89deee30a311caa30cc66741aca5611846b99 WatchSource:0}: Error finding container 1d24bb03d8b99706065404d2b0c89deee30a311caa30cc66741aca5611846b99: Status 404 returned error can't find the container with id 1d24bb03d8b99706065404d2b0c89deee30a311caa30cc66741aca5611846b99 Apr 16 15:11:34.215888 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.215802 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93e980a_c222_444d_a15d_49cf63ac1c76.slice/crio-4d246891ea3b8d1395e6dab98f818f3f79cee7371e5504dd66ff2d5fd3144ecc WatchSource:0}: Error finding container 
4d246891ea3b8d1395e6dab98f818f3f79cee7371e5504dd66ff2d5fd3144ecc: Status 404 returned error can't find the container with id 4d246891ea3b8d1395e6dab98f818f3f79cee7371e5504dd66ff2d5fd3144ecc Apr 16 15:11:34.216692 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.216670 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25330d1_d516_4851_8786_0d9a8e235f7d.slice/crio-ec2ab071b533ec6d19df9bcc535ef3869517b419050a6a0d2f04f3bfcc95f38d WatchSource:0}: Error finding container ec2ab071b533ec6d19df9bcc535ef3869517b419050a6a0d2f04f3bfcc95f38d: Status 404 returned error can't find the container with id ec2ab071b533ec6d19df9bcc535ef3869517b419050a6a0d2f04f3bfcc95f38d Apr 16 15:11:34.217457 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.217245 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e86b66b_dc3f_46f0_b201_2e891f1d30a9.slice/crio-b62285061678b9cb6610e04cc95806e0468d9488bfbaf80b69a548ca297f076f WatchSource:0}: Error finding container b62285061678b9cb6610e04cc95806e0468d9488bfbaf80b69a548ca297f076f: Status 404 returned error can't find the container with id b62285061678b9cb6610e04cc95806e0468d9488bfbaf80b69a548ca297f076f Apr 16 15:11:34.218268 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.218194 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073a04d8_8eac_4abf_9567_c84c5466b74d.slice/crio-ce3d2a9e206b58268f770077cee3eb55b73c2cf2e92ab794f3d24088179b472f WatchSource:0}: Error finding container ce3d2a9e206b58268f770077cee3eb55b73c2cf2e92ab794f3d24088179b472f: Status 404 returned error can't find the container with id ce3d2a9e206b58268f770077cee3eb55b73c2cf2e92ab794f3d24088179b472f Apr 16 15:11:34.219251 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.219231 2567 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da4fddf_5318_4d67_9672_73870158cdf2.slice/crio-f463ebe3b67f4e69d98efdc2204c902a6c54aba116836227fbaa70638a27c71c WatchSource:0}: Error finding container f463ebe3b67f4e69d98efdc2204c902a6c54aba116836227fbaa70638a27c71c: Status 404 returned error can't find the container with id f463ebe3b67f4e69d98efdc2204c902a6c54aba116836227fbaa70638a27c71c Apr 16 15:11:34.220084 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:11:34.219879 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ceed8c_3179_48b7_9f1d_9d9a245ded1e.slice/crio-0fbef80d7f8b2406a96fbaf6788fd7b7652a547f8d4f05ba3bb362c136e97f3a WatchSource:0}: Error finding container 0fbef80d7f8b2406a96fbaf6788fd7b7652a547f8d4f05ba3bb362c136e97f3a: Status 404 returned error can't find the container with id 0fbef80d7f8b2406a96fbaf6788fd7b7652a547f8d4f05ba3bb362c136e97f3a Apr 16 15:11:34.540113 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.539837 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"4d246891ea3b8d1395e6dab98f818f3f79cee7371e5504dd66ff2d5fd3144ecc"} Apr 16 15:11:34.540761 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.540740 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j9vwn" event={"ID":"adf454f4-3a18-4824-b7ac-7736800ea721","Type":"ContainerStarted","Data":"05541fb47ad79d31915b73eb9b1d4befd5149b42f4642a54f50e7ecb5b2530c5"} Apr 16 15:11:34.541674 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.541647 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kn9zn" event={"ID":"27ceed8c-3179-48b7-9f1d-9d9a245ded1e","Type":"ContainerStarted","Data":"0fbef80d7f8b2406a96fbaf6788fd7b7652a547f8d4f05ba3bb362c136e97f3a"} Apr 16 
15:11:34.542534 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.542517 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nll6d" event={"ID":"da21a633-b9e5-4a37-b9bc-12d29b6b666b","Type":"ContainerStarted","Data":"045dd7b58713f26df3cf0c1db62ae07a08332f557e63b753f807900b88a77023"} Apr 16 15:11:34.543878 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.543861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal" event={"ID":"9504c7dace0e1dd78455cb89197ac884","Type":"ContainerStarted","Data":"9ee0788ebe1ab657c091e927660903d075246e705e7e45646cc8a0eda89f4647"} Apr 16 15:11:34.546943 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.546910 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h97tg" event={"ID":"2da4fddf-5318-4d67-9672-73870158cdf2","Type":"ContainerStarted","Data":"f463ebe3b67f4e69d98efdc2204c902a6c54aba116836227fbaa70638a27c71c"} Apr 16 15:11:34.550145 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.550125 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" event={"ID":"073a04d8-8eac-4abf-9567-c84c5466b74d","Type":"ContainerStarted","Data":"ce3d2a9e206b58268f770077cee3eb55b73c2cf2e92ab794f3d24088179b472f"} Apr 16 15:11:34.555235 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.555216 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerStarted","Data":"ec2ab071b533ec6d19df9bcc535ef3869517b419050a6a0d2f04f3bfcc95f38d"} Apr 16 15:11:34.556640 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.556621 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pgdhn" 
event={"ID":"2e86b66b-dc3f-46f0-b201-2e891f1d30a9","Type":"ContainerStarted","Data":"b62285061678b9cb6610e04cc95806e0468d9488bfbaf80b69a548ca297f076f"} Apr 16 15:11:34.560348 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.560328 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" event={"ID":"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6","Type":"ContainerStarted","Data":"1d24bb03d8b99706065404d2b0c89deee30a311caa30cc66741aca5611846b99"} Apr 16 15:11:34.567436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.567412 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:06:32 +0000 UTC" deadline="2027-09-12 19:08:12.331247105 +0000 UTC" Apr 16 15:11:34.567436 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:34.567434 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12339h56m37.763815396s" Apr 16 15:11:35.024666 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.024636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:35.024801 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.024783 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:35.024879 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.024854 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:11:37.024833914 +0000 UTC m=+6.054405292 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:35.125694 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.125663 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:35.125864 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.125837 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:35.125967 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.125873 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:35.125967 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.125886 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:35.125967 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.125957 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft 
podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. No retries permitted until 2026-04-16 15:11:37.125939598 +0000 UTC m=+6.155510968 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:35.529962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.529811 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:35.530118 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.529983 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:35.530482 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.530460 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:35.530580 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:35.530559 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:35.583439 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.583346 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6d557d9a02308f56d7757f97df43f77" containerID="c1c599168714bfa98e697c8ccf4f36cae842ef967cfc1d5807c565bb607677d6" exitCode=0 Apr 16 15:11:35.584442 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.584232 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal" event={"ID":"e6d557d9a02308f56d7757f97df43f77","Type":"ContainerDied","Data":"c1c599168714bfa98e697c8ccf4f36cae842ef967cfc1d5807c565bb607677d6"} Apr 16 15:11:35.615222 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:35.615147 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-252.ec2.internal" podStartSLOduration=3.6151304079999997 podStartE2EDuration="3.615130408s" podCreationTimestamp="2026-04-16 15:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:11:34.568374822 +0000 UTC m=+3.597946200" watchObservedRunningTime="2026-04-16 15:11:35.615130408 +0000 UTC m=+4.644701787" Apr 16 15:11:36.591205 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:36.591168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal" event={"ID":"e6d557d9a02308f56d7757f97df43f77","Type":"ContainerStarted","Data":"cdf26798fb396d8beefba3b69a22093466590e58b4150cf61bb0a740446d2771"} Apr 16 15:11:37.046998 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:37.046901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:37.047165 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.047076 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:37.047165 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.047153 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:41.047133309 +0000 UTC m=+10.076704668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:37.147884 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:37.147846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:37.148075 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.148041 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:37.148075 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.148061 2567 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:37.148075 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.148075 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:37.148241 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.148134 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. No retries permitted until 2026-04-16 15:11:41.148115636 +0000 UTC m=+10.177687007 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:37.529761 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:37.529730 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:37.529965 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:37.529730 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:37.529965 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.529907 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:37.530109 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:37.529990 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:39.529960 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:39.529914 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:39.530458 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:39.530048 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:39.530458 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:39.530083 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:39.530458 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:39.530198 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:41.078966 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:41.078912 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:41.079446 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.079118 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:41.079446 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.079182 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:49.079162282 +0000 UTC m=+18.108733645 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:41.180383 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:41.180269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:41.180571 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.180463 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:41.180571 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.180488 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:41.180571 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.180501 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:41.180571 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.180554 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. 
No retries permitted until 2026-04-16 15:11:49.180540524 +0000 UTC m=+18.210111880 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:41.533973 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:41.533249 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:41.533973 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.533361 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:41.533973 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:41.533749 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:41.533973 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:41.533862 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:43.529616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:43.529584 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:43.530166 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:43.529671 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:43.530166 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:43.529730 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:43.530166 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:43.529798 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:45.530221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:45.530190 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:45.530667 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:45.530188 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:45.530667 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:45.530314 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:45.530667 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:45.530429 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:47.529450 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.529413 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:47.529450 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.529459 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:47.529948 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:47.529618 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:47.529948 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:47.529751 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:47.768909 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.768851 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-252.ec2.internal" podStartSLOduration=15.768834166 podStartE2EDuration="15.768834166s" podCreationTimestamp="2026-04-16 15:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:11:36.614268742 +0000 UTC m=+5.643840120" watchObservedRunningTime="2026-04-16 15:11:47.768834166 +0000 UTC m=+16.798405546" Apr 16 15:11:47.769138 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.768991 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pg2mn"] Apr 16 15:11:47.783752 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.783682 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.783862 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:47.783773 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507" Apr 16 15:11:47.827060 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.827031 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/52595484-a093-4a5b-8052-226e00ba9507-kubelet-config\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.827215 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.827093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.827215 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.827171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/52595484-a093-4a5b-8052-226e00ba9507-dbus\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.928094 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.928054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/52595484-a093-4a5b-8052-226e00ba9507-kubelet-config\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.928260 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.928113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.928260 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.928144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/52595484-a093-4a5b-8052-226e00ba9507-dbus\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.928260 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.928198 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/52595484-a093-4a5b-8052-226e00ba9507-kubelet-config\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.928260 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:47.928239 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:47.928430 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:47.928285 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/52595484-a093-4a5b-8052-226e00ba9507-dbus\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:47.928430 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:47.928307 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret podName:52595484-a093-4a5b-8052-226e00ba9507 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:11:48.428287561 +0000 UTC m=+17.457858928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret") pod "global-pull-secret-syncer-pg2mn" (UID: "52595484-a093-4a5b-8052-226e00ba9507") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:48.431555 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:48.431514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:48.431722 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:48.431649 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:48.431776 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:48.431740 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret podName:52595484-a093-4a5b-8052-226e00ba9507 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:49.431709007 +0000 UTC m=+18.461280386 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret") pod "global-pull-secret-syncer-pg2mn" (UID: "52595484-a093-4a5b-8052-226e00ba9507") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:49.135970 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:49.135921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:49.136351 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.136067 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:49.136351 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.136139 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.136120024 +0000 UTC m=+34.165691380 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:49.236847 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:49.236810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:49.237021 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.236991 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:49.237021 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.237009 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:49.237021 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.237018 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:49.237151 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.237071 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:05.237056228 +0000 UTC m=+34.266627584 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:49.438764 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:49.438668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:49.438908 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.438837 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:49.439062 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.438918 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret podName:52595484-a093-4a5b-8052-226e00ba9507 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:51.43889583 +0000 UTC m=+20.468467189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret") pod "global-pull-secret-syncer-pg2mn" (UID: "52595484-a093-4a5b-8052-226e00ba9507") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:49.530002 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:49.529966 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:49.530152 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:49.529977 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:49.530152 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.530105 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:49.530230 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:49.529979 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:49.530230 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.530184 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507" Apr 16 15:11:49.530392 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:49.530345 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:51.456557 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.456525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:51.456949 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:51.456657 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:51.456949 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:51.456720 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret podName:52595484-a093-4a5b-8052-226e00ba9507 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:55.456701813 +0000 UTC m=+24.486273169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret") pod "global-pull-secret-syncer-pg2mn" (UID: "52595484-a093-4a5b-8052-226e00ba9507") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:51.530566 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.530279 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:51.530661 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:51.530609 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:51.530715 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.530681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:51.530764 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:51.530750 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:51.531646 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.530879 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:51.531646 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:51.530985 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507" Apr 16 15:11:51.617867 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.617840 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h97tg" event={"ID":"2da4fddf-5318-4d67-9672-73870158cdf2","Type":"ContainerStarted","Data":"14abe1682ee5dd176cd8ff431ccfc698b46c1430bbf0c0761ceb4289914a5daf"} Apr 16 15:11:51.619229 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.619210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" event={"ID":"073a04d8-8eac-4abf-9567-c84c5466b74d","Type":"ContainerStarted","Data":"30e06f751d0593c14311930c617dd10b272e67163efc833667fe8acca826ee7d"} Apr 16 15:11:51.620197 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.620176 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerStarted","Data":"906c5d6000baed74a1af27aece30d5d46b138fcb31bf3da5319454b2696a0fa3"} Apr 16 15:11:51.621168 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.621149 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" event={"ID":"f98e480a-3afa-49c5-a57d-cae4dbf3ffb6","Type":"ContainerStarted","Data":"a355ffc8f1c297d77ddf5ca1b27180c06724a01ff7322bf9a97f03d0e81cce1a"} Apr 16 15:11:51.622448 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.622432 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"64fbbe1aa9149aca25c7e514b761d1fa8f9eec643cb070b910159bbc61d7d649"} Apr 16 15:11:51.622490 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.622454 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"76256ba128bcee898ce799b955d19fa431cca985d608e200a015cce24f881755"} Apr 16 15:11:51.623632 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.623615 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j9vwn" event={"ID":"adf454f4-3a18-4824-b7ac-7736800ea721","Type":"ContainerStarted","Data":"6d39b46dc3e4745a8e100b608d4ad46509ee33755b80e994b02884055de37d57"} Apr 16 15:11:51.624781 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.624762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kn9zn" event={"ID":"27ceed8c-3179-48b7-9f1d-9d9a245ded1e","Type":"ContainerStarted","Data":"ef994d92d1e1bb83014845e23ce2f1935d4d03c1187639e85e8fea623f040efd"} Apr 16 15:11:51.626074 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.626056 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nll6d" event={"ID":"da21a633-b9e5-4a37-b9bc-12d29b6b666b","Type":"ContainerStarted","Data":"13040b200ff828581c6ab2957b60331cd100f28940ce2132ec81c8cdbe084f4c"} Apr 16 15:11:51.692385 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.692342 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h97tg" podStartSLOduration=8.269548448 podStartE2EDuration="20.692327033s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.221703465 +0000 UTC m=+3.251274832" lastFinishedPulling="2026-04-16 15:11:46.644482047 +0000 UTC m=+15.674053417" observedRunningTime="2026-04-16 15:11:51.652804714 +0000 UTC m=+20.682376086" watchObservedRunningTime="2026-04-16 15:11:51.692327033 +0000 UTC m=+20.721898389" Apr 16 15:11:51.692941 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.692898 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-kn9zn" podStartSLOduration=3.61974008 podStartE2EDuration="20.692888198s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.223088963 +0000 UTC m=+3.252660332" lastFinishedPulling="2026-04-16 15:11:51.296237077 +0000 UTC m=+20.325808450" observedRunningTime="2026-04-16 15:11:51.69263773 +0000 UTC m=+20.722209110" watchObservedRunningTime="2026-04-16 15:11:51.692888198 +0000 UTC m=+20.722459554" Apr 16 15:11:51.818141 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.817855 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nll6d" podStartSLOduration=3.620530267 podStartE2EDuration="20.817838038s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.210854578 +0000 UTC m=+3.240425941" lastFinishedPulling="2026-04-16 15:11:51.408162345 +0000 UTC m=+20.437733712" observedRunningTime="2026-04-16 15:11:51.789549182 +0000 UTC m=+20.819120573" watchObservedRunningTime="2026-04-16 15:11:51.817838038 +0000 UTC m=+20.847409394" Apr 16 15:11:51.860839 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.860791 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fw4z2" podStartSLOduration=3.915049968 podStartE2EDuration="20.860777156s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.215568687 +0000 UTC m=+3.245140044" lastFinishedPulling="2026-04-16 15:11:51.16129587 +0000 UTC m=+20.190867232" observedRunningTime="2026-04-16 15:11:51.860599759 +0000 UTC m=+20.890171137" watchObservedRunningTime="2026-04-16 15:11:51.860777156 +0000 UTC m=+20.890348535" Apr 16 15:11:51.861045 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:51.860893 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j9vwn" podStartSLOduration=2.922408651 
podStartE2EDuration="19.860887657s" podCreationTimestamp="2026-04-16 15:11:32 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.222820488 +0000 UTC m=+3.252391845" lastFinishedPulling="2026-04-16 15:11:51.16129949 +0000 UTC m=+20.190870851" observedRunningTime="2026-04-16 15:11:51.818216255 +0000 UTC m=+20.847787614" watchObservedRunningTime="2026-04-16 15:11:51.860887657 +0000 UTC m=+20.890459035" Apr 16 15:11:52.191636 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.191601 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:52.192325 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.192304 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:52.450526 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.450499 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 15:11:52.460539 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.460399 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T15:11:52.45052336Z","UUID":"d634d69d-4eb9-4f8f-ac8d-112f6e56b97d","Handler":null,"Name":"","Endpoint":""} Apr 16 15:11:52.462150 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.462132 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 15:11:52.462242 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.462156 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 15:11:52.628888 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.628857 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" event={"ID":"073a04d8-8eac-4abf-9567-c84c5466b74d","Type":"ContainerStarted","Data":"adccd94e28435e78a54061cb522a11c89f0a41a970fd5681a46964069fff0b15"} Apr 16 15:11:52.630227 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.630200 2567 generic.go:358] "Generic (PLEG): container finished" podID="c25330d1-d516-4851-8786-0d9a8e235f7d" containerID="906c5d6000baed74a1af27aece30d5d46b138fcb31bf3da5319454b2696a0fa3" exitCode=0 Apr 16 15:11:52.630339 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.630278 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerDied","Data":"906c5d6000baed74a1af27aece30d5d46b138fcb31bf3da5319454b2696a0fa3"} Apr 16 15:11:52.631540 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.631511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pgdhn" event={"ID":"2e86b66b-dc3f-46f0-b201-2e891f1d30a9","Type":"ContainerStarted","Data":"0350f81fcd3dc3357cae9ba7f32449c4e6d1d73c54c928cb857a8114b990895f"} Apr 16 15:11:52.633759 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.633741 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:11:52.634043 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.634024 2567 generic.go:358] "Generic (PLEG): container finished" podID="d93e980a-c222-444d-a15d-49cf63ac1c76" containerID="64fbbe1aa9149aca25c7e514b761d1fa8f9eec643cb070b910159bbc61d7d649" exitCode=1 Apr 16 15:11:52.634140 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.634117 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" 
event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerDied","Data":"64fbbe1aa9149aca25c7e514b761d1fa8f9eec643cb070b910159bbc61d7d649"} Apr 16 15:11:52.634253 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.634150 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"98cc272f17f3555b7f7cabd5c27fd3111cf59a02c94d9709a4eab186be6367b2"} Apr 16 15:11:52.634253 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.634163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"65aec1ef6655dfe1f9a06e2f11a6c5c050c0eaf383ccb618e1c63ae29bbc27b2"} Apr 16 15:11:52.634253 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.634175 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"09f5dcc8e98b041ba7f6d2d17486d43596590baf245657311750e02db7dc3a07"} Apr 16 15:11:52.634253 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.634187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"ab6c5f7e89e364b67ece724a3ff1e12a1ced3afb745b629f53f511bb5defaf6d"} Apr 16 15:11:52.918576 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:52.918528 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pgdhn" podStartSLOduration=4.841883347 podStartE2EDuration="21.918510599s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.21961059 +0000 UTC m=+3.249181946" lastFinishedPulling="2026-04-16 15:11:51.296237823 +0000 UTC m=+20.325809198" 
observedRunningTime="2026-04-16 15:11:52.916770252 +0000 UTC m=+21.946341630" watchObservedRunningTime="2026-04-16 15:11:52.918510599 +0000 UTC m=+21.948081982" Apr 16 15:11:53.529203 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:53.529176 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:53.529558 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:53.529270 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507" Apr 16 15:11:53.529558 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:53.529291 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:53.529558 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:53.529309 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:53.529558 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:53.529418 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:53.529558 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:53.529504 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:53.637568 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:53.637482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" event={"ID":"073a04d8-8eac-4abf-9567-c84c5466b74d","Type":"ContainerStarted","Data":"294778c65c9b9be5d7e3cc5c7d1336a5a25c8d15b60c4e131ebec318d0ef49ef"} Apr 16 15:11:53.637568 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:53.637527 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 15:11:53.728352 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:53.728302 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vl7l4" podStartSLOduration=3.5898629399999997 podStartE2EDuration="22.728287135s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.220280574 +0000 UTC m=+3.249851934" lastFinishedPulling="2026-04-16 15:11:53.358704757 +0000 UTC m=+22.388276129" observedRunningTime="2026-04-16 15:11:53.727789246 +0000 UTC m=+22.757360627" watchObservedRunningTime="2026-04-16 15:11:53.728287135 +0000 UTC m=+22.757858512" Apr 16 15:11:54.643004 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:54.642741 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:11:54.643521 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:54.643491 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"e74676ab7d8d075ce92be3846ebd794a15249f4b418381e544670e1364d3bccf"} Apr 16 15:11:55.486347 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:55.486303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:55.486539 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:55.486446 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:55.486539 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:55.486517 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret podName:52595484-a093-4a5b-8052-226e00ba9507 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:03.48649855 +0000 UTC m=+32.516069930 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret") pod "global-pull-secret-syncer-pg2mn" (UID: "52595484-a093-4a5b-8052-226e00ba9507") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:11:55.529854 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:55.529813 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:55.529854 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:55.529842 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:11:55.530079 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:55.529813 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:55.530079 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:55.529958 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507" Apr 16 15:11:55.530190 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:55.530084 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:55.530190 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:55.530180 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290" Apr 16 15:11:57.391951 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.391771 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:57.392549 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.392056 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 15:11:57.392549 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.392507 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kn9zn" Apr 16 15:11:57.532470 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.532408 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:11:57.532583 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.532411 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:11:57.532583 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:57.532508 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b" Apr 16 15:11:57.532583 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.532415 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:11:57.532679 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:57.532564    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507"
Apr 16 15:11:57.532679 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:57.532639    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290"
Apr 16 15:11:57.650544 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.650513    2567 generic.go:358] "Generic (PLEG): container finished" podID="c25330d1-d516-4851-8786-0d9a8e235f7d" containerID="e4c65482f9994e1e302e0541dc4cdc52df7b505de6b2d41ea3e798deb856d99a" exitCode=0
Apr 16 15:11:57.650688 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.650600    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerDied","Data":"e4c65482f9994e1e302e0541dc4cdc52df7b505de6b2d41ea3e798deb856d99a"}
Apr 16 15:11:57.653840 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.653804    2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log"
Apr 16 15:11:57.654243 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.654223    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"d05af1b8778a9cbb7a8f008228533f2ee7a80e327745793847c5160e2c07fabe"}
Apr 16 15:11:57.654586 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.654571    2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:11:57.654709 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.654687    2567 scope.go:117] "RemoveContainer" containerID="64fbbe1aa9149aca25c7e514b761d1fa8f9eec643cb070b910159bbc61d7d649"
Apr 16 15:11:57.680725 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:57.680702    2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:11:58.658280 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.658201    2567 generic.go:358] "Generic (PLEG): container finished" podID="c25330d1-d516-4851-8786-0d9a8e235f7d" containerID="b8ffabbabb70bbbcc1771c8e6a5e6e1f152d376818b7dcfafef6ae621b8b54be" exitCode=0
Apr 16 15:11:58.658694 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.658282    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerDied","Data":"b8ffabbabb70bbbcc1771c8e6a5e6e1f152d376818b7dcfafef6ae621b8b54be"}
Apr 16 15:11:58.661738 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.661719    2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log"
Apr 16 15:11:58.662067 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.662045    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" event={"ID":"d93e980a-c222-444d-a15d-49cf63ac1c76","Type":"ContainerStarted","Data":"fc5d7eb6e20fc85e6a6d4152f0dd9adfce785cb7d5af79933d8ac3ecc00c88c6"}
Apr 16 15:11:58.662180 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.662169    2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 15:11:58.662389 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.662373    2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:11:58.676283 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:58.676263    2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:11:59.328039 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.327980    2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn" podStartSLOduration=10.155053209 podStartE2EDuration="27.327956674s" podCreationTimestamp="2026-04-16 15:11:32 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.217477873 +0000 UTC m=+3.247049230" lastFinishedPulling="2026-04-16 15:11:51.390381326 +0000 UTC m=+20.419952695" observedRunningTime="2026-04-16 15:11:58.8663358 +0000 UTC m=+27.895907190" watchObservedRunningTime="2026-04-16 15:11:59.327956674 +0000 UTC m=+28.357528053"
Apr 16 15:11:59.328796 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.328770    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x8njb"]
Apr 16 15:11:59.328953 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.328940    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:11:59.329903 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:59.329873    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290"
Apr 16 15:11:59.341982 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.341960    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pg2mn"]
Apr 16 15:11:59.342086 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.342074    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn"
Apr 16 15:11:59.342180 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:59.342162    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507"
Apr 16 15:11:59.342483 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.342464    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-29h6w"]
Apr 16 15:11:59.342576 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.342559    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:11:59.342680 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:11:59.342662    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b"
Apr 16 15:11:59.665853 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.665811    2567 generic.go:358] "Generic (PLEG): container finished" podID="c25330d1-d516-4851-8786-0d9a8e235f7d" containerID="4ed452927890403866f223e61a117a8e980db3b5b53d2ec438ee57570f4904fc" exitCode=0
Apr 16 15:11:59.666236 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.665889    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerDied","Data":"4ed452927890403866f223e61a117a8e980db3b5b53d2ec438ee57570f4904fc"}
Apr 16 15:11:59.666236 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:11:59.666124    2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 15:12:00.529984 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:00.529904    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:12:00.530102 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:00.530078    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290"
Apr 16 15:12:00.668065 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:00.668037    2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 15:12:01.530700 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:01.530510    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn"
Apr 16 15:12:01.530863 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:01.530571    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:12:01.530863 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:01.530798    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507"
Apr 16 15:12:01.530863 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:01.530840    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b"
Apr 16 15:12:02.529635 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:02.529603    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:12:02.530013 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:02.529751    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290"
Apr 16 15:12:03.529843 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:03.529804    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:12:03.530487 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:03.529938    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29h6w" podUID="bdec121d-e73f-477e-a6d0-1678f02e535b"
Apr 16 15:12:03.530487 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:03.530164    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn"
Apr 16 15:12:03.530487 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:03.530271    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pg2mn" podUID="52595484-a093-4a5b-8052-226e00ba9507"
Apr 16 15:12:03.557450 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:03.557414    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn"
Apr 16 15:12:03.557627 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:03.557583    2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 15:12:03.557693 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:03.557650    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret podName:52595484-a093-4a5b-8052-226e00ba9507 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:19.557630397 +0000 UTC m=+48.587201754 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret") pod "global-pull-secret-syncer-pg2mn" (UID: "52595484-a093-4a5b-8052-226e00ba9507") : object "kube-system"/"original-pull-secret" not registered
Apr 16 15:12:03.623658 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:03.623629    2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:12:03.623884 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:03.623865    2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 15:12:03.634233 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:03.634211    2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jzvn"
Apr 16 15:12:04.326871 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.326848    2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-252.ec2.internal" event="NodeReady"
Apr 16 15:12:04.327099 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.327006    2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 15:12:04.458488 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.458449    2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69778bc578-j9plz"]
Apr 16 15:12:04.494800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.494766    2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"]
Apr 16 15:12:04.495013 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.494959    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.514055 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.514027    2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9vkww"]
Apr 16 15:12:04.514203 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.514142    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:04.520393 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.520372    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 15:12:04.520393 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.520384    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 15:12:04.520578 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.520565    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 15:12:04.542298 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.542272    2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6cpbg"]
Apr 16 15:12:04.542677 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.542448    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.542677 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.542456    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:12:04.552736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.552716    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 15:12:04.552848 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.552736    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kfj7d\""
Apr 16 15:12:04.552848 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.552737    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7gtc5\""
Apr 16 15:12:04.552848 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.552721    2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 15:12:04.554073 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.553613    2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 15:12:04.554073 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.553733    2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 15:12:04.554073 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.553873    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xrkdr\""
Apr 16 15:12:04.554073 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.553943    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5mf5n\""
Apr 16 15:12:04.554073 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.553975    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 15:12:04.554375 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.554099    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 15:12:04.558847 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.558555    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69778bc578-j9plz"]
Apr 16 15:12:04.558847 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.558578    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"]
Apr 16 15:12:04.558847 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.558591    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9vkww"]
Apr 16 15:12:04.558847 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.558702    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:04.562404 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.562384    2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6cpbg"]
Apr 16 15:12:04.588294 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.588229    2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 15:12:04.588446 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.588228    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rc5hb\""
Apr 16 15:12:04.588446 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.588233    2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 15:12:04.588970 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.588953    2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 15:12:04.667539 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667507    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-config-volume\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.667728 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667555    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-trusted-ca\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.667728 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667582    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-tmp-dir\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.667728 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667605    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ca21b1b-b57f-49b7-9334-fd912b40553d-ca-trust-extracted\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.667728 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667659    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-certificates\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667742    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667769    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48mh\" (UniqueName: \"kubernetes.io/projected/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-kube-api-access-s48mh\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667813    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-image-registry-private-configuration\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667838    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667854    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667874    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-installation-pull-secrets\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667891    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a7946e2b-4899-4e79-8237-f8184b28abd7-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667909    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfr2\" (UniqueName: \"kubernetes.io/projected/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-kube-api-access-xvfr2\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.667962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667959    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-bound-sa-token\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.668407 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667983    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m684l\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-kube-api-access-m684l\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.668407 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.667998    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.768877 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.768842    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:04.768877 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.768877    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s48mh\" (UniqueName: \"kubernetes.io/projected/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-kube-api-access-s48mh\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.768920    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-image-registry-private-configuration\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.768961    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.768990    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769018    2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769088    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.269067626 +0000 UTC m=+34.298638997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769022    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-installation-pull-secrets\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769110    2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769123    2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found
Apr 16 15:12:04.769139 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769139    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a7946e2b-4899-4e79-8237-f8184b28abd7-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769160    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.269147279 +0000 UTC m=+34.298718649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769188    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfr2\" (UniqueName: \"kubernetes.io/projected/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-kube-api-access-xvfr2\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769227    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-bound-sa-token\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769275    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m684l\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-kube-api-access-m684l\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769301    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769330    2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.769390    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.26937196 +0000 UTC m=+34.298943330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769333    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-config-volume\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769440    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-trusted-ca\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769471    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-tmp-dir\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769500    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ca21b1b-b57f-49b7-9334-fd912b40553d-ca-trust-extracted\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.769591 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769537    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-certificates\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.770216 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.769825    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-config-volume\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:04.770216 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.770182    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-certificates\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.770216 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.770187    2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:04.770368 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:04.770244    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.270228002 +0000 UTC m=+34.299799367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:04.770449 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.770429    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ca21b1b-b57f-49b7-9334-fd912b40553d-ca-trust-extracted\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.771062 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.771042    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-trusted-ca\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:04.773517 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.773492    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-image-registry-private-configuration\") pod
\"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:04.773613 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.773520 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-installation-pull-secrets\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:04.779118 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.779064 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-tmp-dir\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww" Apr 16 15:12:04.779408 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.779385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a7946e2b-4899-4e79-8237-f8184b28abd7-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" Apr 16 15:12:04.792793 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.792762 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48mh\" (UniqueName: \"kubernetes.io/projected/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-kube-api-access-s48mh\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg" Apr 16 15:12:04.814281 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.814250 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m684l\" (UniqueName: 
\"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-kube-api-access-m684l\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:04.820260 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.820230 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-bound-sa-token\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:04.821048 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:04.821026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfr2\" (UniqueName: \"kubernetes.io/projected/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-kube-api-access-xvfr2\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww" Apr 16 15:12:05.172412 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.172373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb" Apr 16 15:12:05.172574 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.172519 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 15:12:05.172612 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.172589 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:37.172574549 +0000 UTC m=+66.202145905 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : secret "metrics-daemon-secret" not found Apr 16 15:12:05.272878 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.272846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg" Apr 16 15:12:05.273034 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.272903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:05.273034 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.272922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" Apr 16 15:12:05.273034 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.272964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: 
\"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww" Apr 16 15:12:05.273034 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.272999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:12:05.273034 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273004 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273055 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273069 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:06.273053369 +0000 UTC m=+35.302624728 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273076 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273106 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273058 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273132 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273120 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273176 2567 projected.go:194] Error preparing data for projected volume kube-api-access-78lft for pod openshift-network-diagnostics/network-check-target-29h6w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:05.273204 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273123 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:06.273109088 +0000 UTC m=+35.302680465 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found Apr 16 15:12:05.273488 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273227 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:06.2732149 +0000 UTC m=+35.302786261 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found Apr 16 15:12:05.273488 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273242 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:06.273233805 +0000 UTC m=+35.302805168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found Apr 16 15:12:05.273488 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:05.273257 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft podName:bdec121d-e73f-477e-a6d0-1678f02e535b nodeName:}" failed. No retries permitted until 2026-04-16 15:12:37.273249582 +0000 UTC m=+66.302820947 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-78lft" (UniqueName: "kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft") pod "network-check-target-29h6w" (UID: "bdec121d-e73f-477e-a6d0-1678f02e535b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:05.530176 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.530109 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:12:05.530316 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.530296 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w" Apr 16 15:12:05.536674 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.536514 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 15:12:05.536674 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.536514 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 15:12:05.537049 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.537028 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xkvgm\"" Apr 16 15:12:05.537622 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:05.537609 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 15:12:06.282265 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:06.282228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww" Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:06.282291 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg" Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:06.282327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod 
\"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:06.282347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282399 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282426 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282444 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282464 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282481 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282469 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert 
podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:08.282455878 +0000 UTC m=+37.312027247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282547 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:08.282530736 +0000 UTC m=+37.312102106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282559 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:08.282553141 +0000 UTC m=+37.312124497 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found Apr 16 15:12:06.283039 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:06.282573 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:08.282567629 +0000 UTC m=+37.312138985 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found Apr 16 15:12:06.681848 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:06.681806 2567 generic.go:358] "Generic (PLEG): container finished" podID="c25330d1-d516-4851-8786-0d9a8e235f7d" containerID="c933b71ffd57615ad005294073d14afc5647cd166d2c39a552fbd374234578cf" exitCode=0 Apr 16 15:12:06.682020 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:06.681884 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerDied","Data":"c933b71ffd57615ad005294073d14afc5647cd166d2c39a552fbd374234578cf"} Apr 16 15:12:07.686048 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:07.686014 2567 generic.go:358] "Generic (PLEG): container finished" podID="c25330d1-d516-4851-8786-0d9a8e235f7d" containerID="ef6b42007d80e24849a027b5d0b94bcceada9a2fe75e62dd651e1ca6d11b323d" exitCode=0 Apr 16 15:12:07.686370 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:07.686070 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" 
event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerDied","Data":"ef6b42007d80e24849a027b5d0b94bcceada9a2fe75e62dd651e1ca6d11b323d"} Apr 16 15:12:08.298015 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:08.297808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww" Apr 16 15:12:08.298185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:08.298053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg" Apr 16 15:12:08.298185 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.297977 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:08.298185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:08.298095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:08.298185 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:08.298115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" Apr 16 15:12:08.298185 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298149 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:12.298127699 +0000 UTC m=+41.327699072 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298204 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298245 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:12.298232409 +0000 UTC m=+41.327803766 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298256 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298275 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298296 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298325 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:12.298310529 +0000 UTC m=+41.327881905 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found Apr 16 15:12:08.298426 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:08.298343 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:12.298333773 +0000 UTC m=+41.327905137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found Apr 16 15:12:08.690333 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:08.690307 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7kth7" event={"ID":"c25330d1-d516-4851-8786-0d9a8e235f7d","Type":"ContainerStarted","Data":"04d3bced8028a7b8006afbbeb1601482e22a7017e60f4a50d515d16b5832cbe2"} Apr 16 15:12:08.772875 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:08.772830 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7kth7" podStartSLOduration=6.398470285 podStartE2EDuration="37.772815923s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:11:34.218539602 +0000 UTC m=+3.248110958" lastFinishedPulling="2026-04-16 15:12:05.59288524 +0000 UTC m=+34.622456596" observedRunningTime="2026-04-16 15:12:08.771966912 +0000 UTC m=+37.801538291" watchObservedRunningTime="2026-04-16 15:12:08.772815923 +0000 UTC m=+37.802387301" Apr 16 15:12:12.326920 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:12.326880 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:12:12.326920 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:12.326922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:12.326975 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww" Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:12.327016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg" Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327032 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327058 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327104 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327109 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327127 
2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:20.327106069 +0000 UTC m=+49.356677426 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327104 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327148 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:20.327136064 +0000 UTC m=+49.356707432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327165 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:20.327156975 +0000 UTC m=+49.356728334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found Apr 16 15:12:12.327349 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:12.327178 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:20.327170844 +0000 UTC m=+49.356742210 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found Apr 16 15:12:18.564091 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.564060 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg"] Apr 16 15:12:18.608581 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.608556 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m"] Apr 16 15:12:18.608731 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.608714 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.618369 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.618344 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 15:12:18.618533 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.618405 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 15:12:18.619675 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.619658 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-wzpp7\"" Apr 16 15:12:18.619675 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.619667 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 15:12:18.619839 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.619659 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 15:12:18.627386 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.627371 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg"] Apr 16 15:12:18.627450 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.627390 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m"] Apr 16 15:12:18.627450 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.627401 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg"] Apr 16 
15:12:18.627511 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.627498 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.635240 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.635221 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 15:12:18.644139 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.644125 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.649384 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.649368 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg"] Apr 16 15:12:18.650335 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.650316 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 15:12:18.650431 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.650355 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 15:12:18.650633 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.650619 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 15:12:18.650695 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.650621 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 15:12:18.674356 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:12:18.674327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-hub\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.674476 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674393 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47c45c4f-08fe-4fbc-a5fc-55648da79523-tmp\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.674476 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674425 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vzql\" (UniqueName: \"kubernetes.io/projected/176ae89a-b65b-4a60-af6c-e66854cdd99f-kube-api-access-9vzql\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.674476 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghd7v\" (UniqueName: \"kubernetes.io/projected/44aa1e7d-1e7f-4753-bb3c-81689bd10736-kube-api-access-ghd7v\") pod \"managed-serviceaccount-addon-agent-b856f5757-77lmg\" (UID: \"44aa1e7d-1e7f-4753-bb3c-81689bd10736\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.674616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674496 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-ca\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.674616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/47c45c4f-08fe-4fbc-a5fc-55648da79523-klusterlet-config\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.674616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44aa1e7d-1e7f-4753-bb3c-81689bd10736-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b856f5757-77lmg\" (UID: \"44aa1e7d-1e7f-4753-bb3c-81689bd10736\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.674725 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgg5\" (UniqueName: \"kubernetes.io/projected/47c45c4f-08fe-4fbc-a5fc-55648da79523-kube-api-access-vtgg5\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.674725 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.674725 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674661 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.674725 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.674678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/176ae89a-b65b-4a60-af6c-e66854cdd99f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775071 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-hub\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47c45c4f-08fe-4fbc-a5fc-55648da79523-tmp\") pod 
\"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.775221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vzql\" (UniqueName: \"kubernetes.io/projected/176ae89a-b65b-4a60-af6c-e66854cdd99f-kube-api-access-9vzql\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775170 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghd7v\" (UniqueName: \"kubernetes.io/projected/44aa1e7d-1e7f-4753-bb3c-81689bd10736-kube-api-access-ghd7v\") pod \"managed-serviceaccount-addon-agent-b856f5757-77lmg\" (UID: \"44aa1e7d-1e7f-4753-bb3c-81689bd10736\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.775416 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-ca\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775416 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/47c45c4f-08fe-4fbc-a5fc-55648da79523-klusterlet-config\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.775416 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44aa1e7d-1e7f-4753-bb3c-81689bd10736-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b856f5757-77lmg\" (UID: \"44aa1e7d-1e7f-4753-bb3c-81689bd10736\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.775582 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775437 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgg5\" (UniqueName: \"kubernetes.io/projected/47c45c4f-08fe-4fbc-a5fc-55648da79523-kube-api-access-vtgg5\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.775582 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775461 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775582 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775582 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/176ae89a-b65b-4a60-af6c-e66854cdd99f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.775881 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.775857 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47c45c4f-08fe-4fbc-a5fc-55648da79523-tmp\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.776459 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.776435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/176ae89a-b65b-4a60-af6c-e66854cdd99f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.778574 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.778551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-ca\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.778694 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.778550 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-hub\") pod 
\"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.778694 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.778621 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.778760 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.778715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/176ae89a-b65b-4a60-af6c-e66854cdd99f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.779206 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.779191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44aa1e7d-1e7f-4753-bb3c-81689bd10736-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b856f5757-77lmg\" (UID: \"44aa1e7d-1e7f-4753-bb3c-81689bd10736\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.779292 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.779273 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/47c45c4f-08fe-4fbc-a5fc-55648da79523-klusterlet-config\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.809944 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.809902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghd7v\" (UniqueName: \"kubernetes.io/projected/44aa1e7d-1e7f-4753-bb3c-81689bd10736-kube-api-access-ghd7v\") pod \"managed-serviceaccount-addon-agent-b856f5757-77lmg\" (UID: \"44aa1e7d-1e7f-4753-bb3c-81689bd10736\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.810098 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.809907 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vzql\" (UniqueName: \"kubernetes.io/projected/176ae89a-b65b-4a60-af6c-e66854cdd99f-kube-api-access-9vzql\") pod \"cluster-proxy-proxy-agent-749c984784-rznpg\" (UID: \"176ae89a-b65b-4a60-af6c-e66854cdd99f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:18.810098 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.809988 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgg5\" (UniqueName: \"kubernetes.io/projected/47c45c4f-08fe-4fbc-a5fc-55648da79523-kube-api-access-vtgg5\") pod \"klusterlet-addon-workmgr-5556954d54-lvw7m\" (UID: \"47c45c4f-08fe-4fbc-a5fc-55648da79523\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.930354 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.930281 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" Apr 16 15:12:18.937057 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.937024 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:12:18.952668 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:18.952642 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" Apr 16 15:12:19.146126 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.146102 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m"] Apr 16 15:12:19.150806 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:12:19.150778 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c45c4f_08fe_4fbc_a5fc_55648da79523.slice/crio-bf6b11d6022cbfb30927e689e299ce5f5089b424adaafe87503d791462db5aa3 WatchSource:0}: Error finding container bf6b11d6022cbfb30927e689e299ce5f5089b424adaafe87503d791462db5aa3: Status 404 returned error can't find the container with id bf6b11d6022cbfb30927e689e299ce5f5089b424adaafe87503d791462db5aa3 Apr 16 15:12:19.345109 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.345079 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg"] Apr 16 15:12:19.349521 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:12:19.349452 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44aa1e7d_1e7f_4753_bb3c_81689bd10736.slice/crio-41dbfcfe732915697e5500f01023fcf657fcc1e04a2d549656765341a88174b3 WatchSource:0}: Error finding container 41dbfcfe732915697e5500f01023fcf657fcc1e04a2d549656765341a88174b3: Status 404 returned error can't find the container with id 41dbfcfe732915697e5500f01023fcf657fcc1e04a2d549656765341a88174b3 Apr 16 15:12:19.352433 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:12:19.352408 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg"] Apr 16 15:12:19.352864 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:12:19.352840 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod176ae89a_b65b_4a60_af6c_e66854cdd99f.slice/crio-bb3b257585766d34889748136acb0131bac30cc2fc0389f1d5bcc951e5e11d10 WatchSource:0}: Error finding container bb3b257585766d34889748136acb0131bac30cc2fc0389f1d5bcc951e5e11d10: Status 404 returned error can't find the container with id bb3b257585766d34889748136acb0131bac30cc2fc0389f1d5bcc951e5e11d10 Apr 16 15:12:19.581766 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.581737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:12:19.584170 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.584151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52595484-a093-4a5b-8052-226e00ba9507-original-pull-secret\") pod \"global-pull-secret-syncer-pg2mn\" (UID: \"52595484-a093-4a5b-8052-226e00ba9507\") " pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:12:19.640323 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.640267 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pg2mn" Apr 16 15:12:19.711847 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.711818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" event={"ID":"47c45c4f-08fe-4fbc-a5fc-55648da79523","Type":"ContainerStarted","Data":"bf6b11d6022cbfb30927e689e299ce5f5089b424adaafe87503d791462db5aa3"} Apr 16 15:12:19.712984 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.712957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" event={"ID":"44aa1e7d-1e7f-4753-bb3c-81689bd10736","Type":"ContainerStarted","Data":"41dbfcfe732915697e5500f01023fcf657fcc1e04a2d549656765341a88174b3"} Apr 16 15:12:19.713989 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.713968 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" event={"ID":"176ae89a-b65b-4a60-af6c-e66854cdd99f","Type":"ContainerStarted","Data":"bb3b257585766d34889748136acb0131bac30cc2fc0389f1d5bcc951e5e11d10"} Apr 16 15:12:19.770136 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:19.770113 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pg2mn"] Apr 16 15:12:19.772204 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:12:19.772173 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52595484_a093_4a5b_8052_226e00ba9507.slice/crio-6824ab252908ddfe0526d8c6837961ac8803a6a9e367f5bee20701e2ea9cf2ba WatchSource:0}: Error finding container 6824ab252908ddfe0526d8c6837961ac8803a6a9e367f5bee20701e2ea9cf2ba: Status 404 returned error can't find the container with id 6824ab252908ddfe0526d8c6837961ac8803a6a9e367f5bee20701e2ea9cf2ba Apr 16 15:12:20.386406 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:12:20.386369 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:20.386721 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:20.386452 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:20.386721 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:20.386499 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:20.386721 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:20.386535 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:20.386721 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.386714 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:20.386992 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.386775 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:36.386756117 +0000 UTC m=+65.416327493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:20.386992 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.386835 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:20.386992 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.386867 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:36.386856228 +0000 UTC m=+65.416427590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found
Apr 16 15:12:20.386992 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.386941 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 15:12:20.386992 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.386976 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:36.386964008 +0000 UTC m=+65.416535368 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found
Apr 16 15:12:20.387254 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.387038 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:12:20.387254 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.387050 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found
Apr 16 15:12:20.387254 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:20.387085 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:36.387073665 +0000 UTC m=+65.416645027 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found
Apr 16 15:12:20.717768 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:20.717684 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pg2mn" event={"ID":"52595484-a093-4a5b-8052-226e00ba9507","Type":"ContainerStarted","Data":"6824ab252908ddfe0526d8c6837961ac8803a6a9e367f5bee20701e2ea9cf2ba"}
Apr 16 15:12:25.729616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.729582 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" event={"ID":"44aa1e7d-1e7f-4753-bb3c-81689bd10736","Type":"ContainerStarted","Data":"4ea5ed2873ef653815ccd73e80265a7bf4820179ad025ee6faa3e65f913b0608"}
Apr 16 15:12:25.730876 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.730855 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pg2mn" event={"ID":"52595484-a093-4a5b-8052-226e00ba9507","Type":"ContainerStarted","Data":"7f6d6e8d0926daac45629ea72184e164e266d203d2d4d76fd4eb5fd69fe8f47b"}
Apr 16 15:12:25.732109 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.732089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" event={"ID":"47c45c4f-08fe-4fbc-a5fc-55648da79523","Type":"ContainerStarted","Data":"4f152987fe81a2f1ddac3204a3a10197a250dc4c0bd0780aaf4082161f6ba6bc"}
Apr 16 15:12:25.732304 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.732287 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m"
Apr 16 15:12:25.733699 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.733682 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m"
Apr 16 15:12:25.764476 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.764435 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" podStartSLOduration=1.862242264 podStartE2EDuration="7.764423575s" podCreationTimestamp="2026-04-16 15:12:18 +0000 UTC" firstStartedPulling="2026-04-16 15:12:19.3517065 +0000 UTC m=+48.381277856" lastFinishedPulling="2026-04-16 15:12:25.253887797 +0000 UTC m=+54.283459167" observedRunningTime="2026-04-16 15:12:25.762626854 +0000 UTC m=+54.792198245" watchObservedRunningTime="2026-04-16 15:12:25.764423575 +0000 UTC m=+54.793994953"
Apr 16 15:12:25.811012 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:25.810968 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pg2mn" podStartSLOduration=33.319221125 podStartE2EDuration="38.810954706s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:12:19.773987121 +0000 UTC m=+48.803558478" lastFinishedPulling="2026-04-16 15:12:25.265720701 +0000 UTC m=+54.295292059" observedRunningTime="2026-04-16 15:12:25.803905516 +0000 UTC m=+54.833476895" watchObservedRunningTime="2026-04-16 15:12:25.810954706 +0000 UTC m=+54.840526137"
Apr 16 15:12:36.408937 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:36.408883 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:36.408967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:36.408996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:36.409021 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409021 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409065 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409081 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409108 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409118 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:13:08.4091005 +0000 UTC m=+97.438671856 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409131 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409139 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:13:08.409126908 +0000 UTC m=+97.438698270 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409201 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:08.409182368 +0000 UTC m=+97.438753727 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found
Apr 16 15:12:36.409377 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:36.409220 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:08.409209065 +0000 UTC m=+97.438780424 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:37.213850 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.213809 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:12:37.214042 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:37.213964 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 15:12:37.214042 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:12:37.214029 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:41.214011411 +0000 UTC m=+130.243582772 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : secret "metrics-daemon-secret" not found
Apr 16 15:12:37.314516 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.314476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:12:37.318194 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.318176 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 15:12:37.327549 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.327530 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 15:12:37.337633 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.337605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78lft\" (UniqueName: \"kubernetes.io/projected/bdec121d-e73f-477e-a6d0-1678f02e535b-kube-api-access-78lft\") pod \"network-check-target-29h6w\" (UID: \"bdec121d-e73f-477e-a6d0-1678f02e535b\") " pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:12:37.350111 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.350092 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xkvgm\""
Apr 16 15:12:37.358055 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.358040 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:12:37.494063 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.491908 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" podStartSLOduration=13.375507777 podStartE2EDuration="19.491831746s" podCreationTimestamp="2026-04-16 15:12:18 +0000 UTC" firstStartedPulling="2026-04-16 15:12:19.152491519 +0000 UTC m=+48.182062875" lastFinishedPulling="2026-04-16 15:12:25.268815477 +0000 UTC m=+54.298386844" observedRunningTime="2026-04-16 15:12:25.845354471 +0000 UTC m=+54.874925848" watchObservedRunningTime="2026-04-16 15:12:37.491831746 +0000 UTC m=+66.521403126"
Apr 16 15:12:37.494063 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.492625 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-29h6w"]
Apr 16 15:12:37.496128 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:12:37.496096 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdec121d_e73f_477e_a6d0_1678f02e535b.slice/crio-77062b7f5836bbc9ea3c8fb06b3141d7b81daaa2303b380c4e254abec0199dcd WatchSource:0}: Error finding container 77062b7f5836bbc9ea3c8fb06b3141d7b81daaa2303b380c4e254abec0199dcd: Status 404 returned error can't find the container with id 77062b7f5836bbc9ea3c8fb06b3141d7b81daaa2303b380c4e254abec0199dcd
Apr 16 15:12:37.756340 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:37.756258 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-29h6w" event={"ID":"bdec121d-e73f-477e-a6d0-1678f02e535b","Type":"ContainerStarted","Data":"77062b7f5836bbc9ea3c8fb06b3141d7b81daaa2303b380c4e254abec0199dcd"}
Apr 16 15:12:40.763813 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:40.763770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-29h6w" event={"ID":"bdec121d-e73f-477e-a6d0-1678f02e535b","Type":"ContainerStarted","Data":"b2cb806e543772b3a218181fcc0bfa638dc017b3cdd44dc34c95e8252baf10d5"}
Apr 16 15:12:40.764288 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:40.763943 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:12:40.788433 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:40.788387 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-29h6w" podStartSLOduration=67.075907495 podStartE2EDuration="1m9.788375549s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:12:37.49812239 +0000 UTC m=+66.527693750" lastFinishedPulling="2026-04-16 15:12:40.210590438 +0000 UTC m=+69.240161804" observedRunningTime="2026-04-16 15:12:40.787063687 +0000 UTC m=+69.816635066" watchObservedRunningTime="2026-04-16 15:12:40.788375549 +0000 UTC m=+69.817946927"
Apr 16 15:12:42.770528 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:42.770487 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" event={"ID":"176ae89a-b65b-4a60-af6c-e66854cdd99f","Type":"ContainerStarted","Data":"e53c959753322c0c3a606dec15c3c651ff7962cfd7b4ad15911bb2aabeffe9c0"}
Apr 16 15:12:44.777018 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:44.776984 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" event={"ID":"176ae89a-b65b-4a60-af6c-e66854cdd99f","Type":"ContainerStarted","Data":"ef80a767b64d09dcb29fcfdeae99808f5baa2c19dd1b30bd248cbcee4573920a"}
Apr 16 15:12:44.777018 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:44.777017 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" event={"ID":"176ae89a-b65b-4a60-af6c-e66854cdd99f","Type":"ContainerStarted","Data":"911dd7e4929cb7c9065518570b43567aa8bbebae26b8c23691405ba6e624ab5e"}
Apr 16 15:12:44.807044 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:12:44.806998 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" podStartSLOduration=2.160262442 podStartE2EDuration="26.806983957s" podCreationTimestamp="2026-04-16 15:12:18 +0000 UTC" firstStartedPulling="2026-04-16 15:12:19.354467473 +0000 UTC m=+48.384038829" lastFinishedPulling="2026-04-16 15:12:44.001188977 +0000 UTC m=+73.030760344" observedRunningTime="2026-04-16 15:12:44.806464806 +0000 UTC m=+73.836036209" watchObservedRunningTime="2026-04-16 15:12:44.806983957 +0000 UTC m=+73.836555334"
Apr 16 15:13:08.457667 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:13:08.457535 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:13:08.457667 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:13:08.457575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:13:08.457667 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:13:08.457599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:13:08.457667 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:13:08.457619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457707 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457708 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457767 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:12.457749909 +0000 UTC m=+161.487321266 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457711 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457708 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457783 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:12.457775252 +0000 UTC m=+161.487346609 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457786 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457856 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:14:12.457838379 +0000 UTC m=+161.487409749 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found
Apr 16 15:13:08.458263 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:08.457962 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:14:12.457942136 +0000 UTC m=+161.487513541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found
Apr 16 15:13:11.767663 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:13:11.767635 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-29h6w"
Apr 16 15:13:41.308024 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:13:41.307984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:13:41.308499 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:41.308130 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 15:13:41.308499 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:13:41.308193 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs podName:ba267359-2c95-4792-991e-a2e9eae5b290 nodeName:}" failed. No retries permitted until 2026-04-16 15:15:43.308176874 +0000 UTC m=+252.337748231 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs") pod "network-metrics-daemon-x8njb" (UID: "ba267359-2c95-4792-991e-a2e9eae5b290") : secret "metrics-daemon-secret" not found
Apr 16 15:14:07.509497 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:07.509451 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-69778bc578-j9plz" podUID="4ca21b1b-b57f-49b7-9334-fd912b40553d"
Apr 16 15:14:07.524621 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:07.524590 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" podUID="a7946e2b-4899-4e79-8237-f8184b28abd7"
Apr 16 15:14:07.554643 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:07.554603 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9vkww" podUID="aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9"
Apr 16 15:14:07.562817 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:07.562800 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-x8njb" podUID="ba267359-2c95-4792-991e-a2e9eae5b290"
Apr 16 15:14:07.571999 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:07.571976 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6cpbg" podUID="21ef67e8-6503-4ba6-b6ed-bc1016b3958d"
Apr 16 15:14:07.972064 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:07.972034 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vkww"
Apr 16 15:14:07.972247 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:07.972033 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:14:07.972247 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:07.972033 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:14:07.972368 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:07.972039 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:14:12.536855 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:12.536803 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:14:12.536855 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:12.536857 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:12.536882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") pod \"image-registry-69778bc578-j9plz\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") " pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:12.536902 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.536969 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537006 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537042 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert podName:21ef67e8-6503-4ba6-b6ed-bc1016b3958d nodeName:}" failed. No retries permitted until 2026-04-16 15:16:14.537024826 +0000 UTC m=+283.566596182 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert") pod "ingress-canary-6cpbg" (UID: "21ef67e8-6503-4ba6-b6ed-bc1016b3958d") : secret "canary-serving-cert" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537056 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537069 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert podName:a7946e2b-4899-4e79-8237-f8184b28abd7 nodeName:}" failed. No retries permitted until 2026-04-16 15:16:14.53705496 +0000 UTC m=+283.566626315 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tzrgf" (UID: "a7946e2b-4899-4e79-8237-f8184b28abd7") : secret "networking-console-plugin-cert" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537077 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69778bc578-j9plz: secret "image-registry-tls" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537064 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537133 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls podName:4ca21b1b-b57f-49b7-9334-fd912b40553d nodeName:}" failed. No retries permitted until 2026-04-16 15:16:14.53711755 +0000 UTC m=+283.566688914 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls") pod "image-registry-69778bc578-j9plz" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d") : secret "image-registry-tls" not found
Apr 16 15:14:12.537371 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:12.537173 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls podName:aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9 nodeName:}" failed. No retries permitted until 2026-04-16 15:16:14.537163398 +0000 UTC m=+283.566734754 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls") pod "dns-default-9vkww" (UID: "aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9") : secret "dns-default-metrics-tls" not found
Apr 16 15:14:14.728981 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:14.728952 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j9vwn_adf454f4-3a18-4824-b7ac-7736800ea721/dns-node-resolver/0.log"
Apr 16 15:14:15.928833 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:15.928806 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h97tg_2da4fddf-5318-4d67-9672-73870158cdf2/node-ca/0.log"
Apr 16 15:14:21.530962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:21.530868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:14:25.733242 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:25.733133 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" podUID="47c45c4f-08fe-4fbc-a5fc-55648da79523" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused"
Apr 16 15:14:26.019991 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:26.019890 2567 generic.go:358] "Generic (PLEG): container finished" podID="44aa1e7d-1e7f-4753-bb3c-81689bd10736" containerID="4ea5ed2873ef653815ccd73e80265a7bf4820179ad025ee6faa3e65f913b0608" exitCode=255
Apr 16 15:14:26.019991 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:26.019963 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" event={"ID":"44aa1e7d-1e7f-4753-bb3c-81689bd10736","Type":"ContainerDied","Data":"4ea5ed2873ef653815ccd73e80265a7bf4820179ad025ee6faa3e65f913b0608"}
Apr 16 15:14:26.020326 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:26.020305 2567 scope.go:117] "RemoveContainer" containerID="4ea5ed2873ef653815ccd73e80265a7bf4820179ad025ee6faa3e65f913b0608"
Apr 16 15:14:26.021205 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:26.021188 2567 generic.go:358] "Generic (PLEG): container finished" podID="47c45c4f-08fe-4fbc-a5fc-55648da79523" containerID="4f152987fe81a2f1ddac3204a3a10197a250dc4c0bd0780aaf4082161f6ba6bc" exitCode=1
Apr 16 15:14:26.021264 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:26.021220 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" event={"ID":"47c45c4f-08fe-4fbc-a5fc-55648da79523","Type":"ContainerDied","Data":"4f152987fe81a2f1ddac3204a3a10197a250dc4c0bd0780aaf4082161f6ba6bc"}
Apr 16 15:14:26.021521 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:26.021505 2567 scope.go:117] "RemoveContainer" containerID="4f152987fe81a2f1ddac3204a3a10197a250dc4c0bd0780aaf4082161f6ba6bc"
Apr 16 15:14:27.025014 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:27.024981 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b856f5757-77lmg" event={"ID":"44aa1e7d-1e7f-4753-bb3c-81689bd10736","Type":"ContainerStarted","Data":"96c4bfb4d682a68155a2c353fdf1cf57298fa1b386b3bb0156aeb9171accb631"}
Apr 16 15:14:27.026373 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:27.026353 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" event={"ID":"47c45c4f-08fe-4fbc-a5fc-55648da79523","Type":"ContainerStarted","Data":"296bed3d6d579086ace3c7afb93a8b8638a753af2dfdaa68739462d6f43c3a4e"}
Apr 16 15:14:27.026613 ip-10-0-135-252
kubenswrapper[2567]: I0416 15:14:27.026589 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:14:27.027173 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:27.027157 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5556954d54-lvw7m" Apr 16 15:14:32.024475 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.024442 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2lhkw"] Apr 16 15:14:32.027630 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.027614 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.030397 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.030374 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 15:14:32.030514 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.030375 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 15:14:32.031536 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.031519 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 15:14:32.031631 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.031617 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 15:14:32.031715 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.031701 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wvcg7\"" Apr 16 15:14:32.041688 ip-10-0-135-252 kubenswrapper[2567]: 
I0416 15:14:32.041666 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2lhkw"] Apr 16 15:14:32.189604 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.189563 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/423116e6-4cda-4fd6-8ffa-c21a25175327-data-volume\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.189756 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.189612 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/423116e6-4cda-4fd6-8ffa-c21a25175327-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.189756 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.189634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/423116e6-4cda-4fd6-8ffa-c21a25175327-crio-socket\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.189756 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.189718 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pckhw\" (UniqueName: \"kubernetes.io/projected/423116e6-4cda-4fd6-8ffa-c21a25175327-kube-api-access-pckhw\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.189864 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:14:32.189781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/423116e6-4cda-4fd6-8ffa-c21a25175327-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.290580 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.290495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pckhw\" (UniqueName: \"kubernetes.io/projected/423116e6-4cda-4fd6-8ffa-c21a25175327-kube-api-access-pckhw\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.290580 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.290577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/423116e6-4cda-4fd6-8ffa-c21a25175327-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.290790 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.290626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/423116e6-4cda-4fd6-8ffa-c21a25175327-data-volume\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.290790 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.290662 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/423116e6-4cda-4fd6-8ffa-c21a25175327-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.290790 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.290691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/423116e6-4cda-4fd6-8ffa-c21a25175327-crio-socket\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.290790 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.290786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/423116e6-4cda-4fd6-8ffa-c21a25175327-crio-socket\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.291098 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.291078 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/423116e6-4cda-4fd6-8ffa-c21a25175327-data-volume\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.291273 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.291257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/423116e6-4cda-4fd6-8ffa-c21a25175327-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.293366 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.293346 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/423116e6-4cda-4fd6-8ffa-c21a25175327-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.299729 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.299710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pckhw\" (UniqueName: \"kubernetes.io/projected/423116e6-4cda-4fd6-8ffa-c21a25175327-kube-api-access-pckhw\") pod \"insights-runtime-extractor-2lhkw\" (UID: \"423116e6-4cda-4fd6-8ffa-c21a25175327\") " pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.336709 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.336682 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2lhkw" Apr 16 15:14:32.450145 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:32.450080 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2lhkw"] Apr 16 15:14:32.453665 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:14:32.453637 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423116e6_4cda_4fd6_8ffa_c21a25175327.slice/crio-02b0f591b668686ddc7fa44baa0a19674ad607110aa4267894d834ff681b078f WatchSource:0}: Error finding container 02b0f591b668686ddc7fa44baa0a19674ad607110aa4267894d834ff681b078f: Status 404 returned error can't find the container with id 02b0f591b668686ddc7fa44baa0a19674ad607110aa4267894d834ff681b078f Apr 16 15:14:33.042090 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:33.042058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2lhkw" 
event={"ID":"423116e6-4cda-4fd6-8ffa-c21a25175327","Type":"ContainerStarted","Data":"93e2c3c13132dc0cefb539f330cc679003d9caa3f600917cb4fbc312eb521d65"} Apr 16 15:14:33.042470 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:33.042098 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2lhkw" event={"ID":"423116e6-4cda-4fd6-8ffa-c21a25175327","Type":"ContainerStarted","Data":"02b0f591b668686ddc7fa44baa0a19674ad607110aa4267894d834ff681b078f"} Apr 16 15:14:34.046963 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:34.046916 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2lhkw" event={"ID":"423116e6-4cda-4fd6-8ffa-c21a25175327","Type":"ContainerStarted","Data":"20774dfc673e3e10938ac1a1462df6c7181f329611a989b6e972158da4371300"} Apr 16 15:14:35.051048 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:35.051014 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2lhkw" event={"ID":"423116e6-4cda-4fd6-8ffa-c21a25175327","Type":"ContainerStarted","Data":"dfa4c04eefaba56275aefa399396ce208735376afaecc9de11ba7b8c9a2a6df3"} Apr 16 15:14:35.071617 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:35.071576 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2lhkw" podStartSLOduration=0.752662369 podStartE2EDuration="3.071563828s" podCreationTimestamp="2026-04-16 15:14:32 +0000 UTC" firstStartedPulling="2026-04-16 15:14:32.511210318 +0000 UTC m=+181.540781674" lastFinishedPulling="2026-04-16 15:14:34.830111774 +0000 UTC m=+183.859683133" observedRunningTime="2026-04-16 15:14:35.070690926 +0000 UTC m=+184.100262308" watchObservedRunningTime="2026-04-16 15:14:35.071563828 +0000 UTC m=+184.101135207" Apr 16 15:14:49.044665 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.044630 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-frdt2"] Apr 16 15:14:49.047976 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.047958 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.050333 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.050313 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8z8zc\"" Apr 16 15:14:49.050481 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.050464 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 15:14:49.050569 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.050554 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 15:14:49.051617 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.051598 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 15:14:49.051703 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.051619 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 15:14:49.051703 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.051658 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 15:14:49.051806 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.051738 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 15:14:49.111919 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.111893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-textfile\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.111919 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.111920 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06d73659-193b-47fd-b466-a7decbc45ca9-metrics-client-ca\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.111958 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-accelerators-collector-config\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.111994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5x7m\" (UniqueName: \"kubernetes.io/projected/06d73659-193b-47fd-b466-a7decbc45ca9-kube-api-access-c5x7m\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.112014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " 
pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.112053 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-root\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112103 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.112083 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-sys\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112269 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.112110 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-tls\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.112269 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.112171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-wtmp\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213180 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-wtmp\") pod 
\"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213284 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-textfile\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213284 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06d73659-193b-47fd-b466-a7decbc45ca9-metrics-client-ca\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213284 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-accelerators-collector-config\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213284 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213279 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5x7m\" (UniqueName: \"kubernetes.io/projected/06d73659-193b-47fd-b466-a7decbc45ca9-kube-api-access-c5x7m\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213461 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213305 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213461 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-wtmp\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213461 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-root\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213461 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213390 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-root\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213461 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213394 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-sys\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213461 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-tls\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213715 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213497 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06d73659-193b-47fd-b466-a7decbc45ca9-sys\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213715 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213653 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-textfile\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.213989 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213970 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06d73659-193b-47fd-b466-a7decbc45ca9-metrics-client-ca\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.214030 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.213981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-accelerators-collector-config\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.216409 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.216386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.216503 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.216413 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06d73659-193b-47fd-b466-a7decbc45ca9-node-exporter-tls\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.222827 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.222809 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5x7m\" (UniqueName: \"kubernetes.io/projected/06d73659-193b-47fd-b466-a7decbc45ca9-kube-api-access-c5x7m\") pod \"node-exporter-frdt2\" (UID: \"06d73659-193b-47fd-b466-a7decbc45ca9\") " pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.356684 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:49.356660 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-frdt2" Apr 16 15:14:49.365107 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:14:49.365060 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d73659_193b_47fd_b466_a7decbc45ca9.slice/crio-043189fce25f5fd053b8694e4465f176f35b93812bdab74ab539243d25e74f79 WatchSource:0}: Error finding container 043189fce25f5fd053b8694e4465f176f35b93812bdab74ab539243d25e74f79: Status 404 returned error can't find the container with id 043189fce25f5fd053b8694e4465f176f35b93812bdab74ab539243d25e74f79 Apr 16 15:14:50.087286 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:50.087254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frdt2" event={"ID":"06d73659-193b-47fd-b466-a7decbc45ca9","Type":"ContainerStarted","Data":"043189fce25f5fd053b8694e4465f176f35b93812bdab74ab539243d25e74f79"} Apr 16 15:14:51.090897 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:51.090866 2567 generic.go:358] "Generic (PLEG): container finished" podID="06d73659-193b-47fd-b466-a7decbc45ca9" containerID="4c0c5dfa0520cd302f29ce49b20c06e4d20d04d45d72f89a1b20e6e0e1541d85" exitCode=0 Apr 16 15:14:51.091318 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:51.090974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frdt2" event={"ID":"06d73659-193b-47fd-b466-a7decbc45ca9","Type":"ContainerDied","Data":"4c0c5dfa0520cd302f29ce49b20c06e4d20d04d45d72f89a1b20e6e0e1541d85"} Apr 16 15:14:52.095395 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:52.095362 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frdt2" event={"ID":"06d73659-193b-47fd-b466-a7decbc45ca9","Type":"ContainerStarted","Data":"360979510fd74e57cd0354f44e47663445001b1497255da8385a91d40f422d60"} Apr 16 15:14:52.095395 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:52.095398 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frdt2" event={"ID":"06d73659-193b-47fd-b466-a7decbc45ca9","Type":"ContainerStarted","Data":"3d4042e7ccecc1f8fc3cc6019479cf44a5e49cd7f2d22ceb170f3c77c23e9d21"} Apr 16 15:14:52.117873 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:52.117815 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-frdt2" podStartSLOduration=2.459592731 podStartE2EDuration="3.117800817s" podCreationTimestamp="2026-04-16 15:14:49 +0000 UTC" firstStartedPulling="2026-04-16 15:14:49.366866911 +0000 UTC m=+198.396438269" lastFinishedPulling="2026-04-16 15:14:50.025074996 +0000 UTC m=+199.054646355" observedRunningTime="2026-04-16 15:14:52.11613981 +0000 UTC m=+201.145711190" watchObservedRunningTime="2026-04-16 15:14:52.117800817 +0000 UTC m=+201.147372194" Apr 16 15:14:53.782569 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:53.780026 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69778bc578-j9plz"] Apr 16 15:14:53.782569 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:14:53.780460 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-69778bc578-j9plz" podUID="4ca21b1b-b57f-49b7-9334-fd912b40553d" Apr 16 15:14:54.100774 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.100746 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69778bc578-j9plz" Apr 16 15:14:54.105100 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.105064 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:14:54.151479 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151451 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-bound-sa-token\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.151701 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151484 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m684l\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-kube-api-access-m684l\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.151701 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151576 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-installation-pull-secrets\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.151701 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151625 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-image-registry-private-configuration\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.151701 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151662 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-certificates\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID:
\"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.151955 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151702 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ca21b1b-b57f-49b7-9334-fd912b40553d-ca-trust-extracted\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.151955 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.151733 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-trusted-ca\") pod \"4ca21b1b-b57f-49b7-9334-fd912b40553d\" (UID: \"4ca21b1b-b57f-49b7-9334-fd912b40553d\") "
Apr 16 15:14:54.152060 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.152036 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca21b1b-b57f-49b7-9334-fd912b40553d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:14:54.152113 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.152045 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "registry-certificates".
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:14:54.152362 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.152338 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:14:54.153870 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.153830 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:14:54.153870 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.153837 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:14:54.154016 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.153897 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-kube-api-access-m684l" (OuterVolumeSpecName: "kube-api-access-m684l") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "kube-api-access-m684l".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:14:54.154016 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.153908 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4ca21b1b-b57f-49b7-9334-fd912b40553d" (UID: "4ca21b1b-b57f-49b7-9334-fd912b40553d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:14:54.252596 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252564 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ca21b1b-b57f-49b7-9334-fd912b40553d-ca-trust-extracted\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:54.252596 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252589 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-trusted-ca\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:54.252596 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252599 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-bound-sa-token\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:54.252785 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252608 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m684l\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-kube-api-access-m684l\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:54.252785 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252618 2567 reconciler_common.go:299] "Volume detached for volume
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-installation-pull-secrets\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:54.252785 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252627 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ca21b1b-b57f-49b7-9334-fd912b40553d-image-registry-private-configuration\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:54.252785 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:54.252636 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-certificates\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:55.104252 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:55.103013 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-69778bc578-j9plz"
Apr 16 15:14:55.144976 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:55.144948 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69778bc578-j9plz"]
Apr 16 15:14:55.148912 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:55.148887 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69778bc578-j9plz"]
Apr 16 15:14:55.160341 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:55.160319 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ca21b1b-b57f-49b7-9334-fd912b40553d-registry-tls\") on node \"ip-10-0-135-252.ec2.internal\" DevicePath \"\""
Apr 16 15:14:55.532831 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:14:55.532753 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca21b1b-b57f-49b7-9334-fd912b40553d" path="/var/lib/kubelet/pods/4ca21b1b-b57f-49b7-9334-fd912b40553d/volumes"
Apr 16 15:15:18.953612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:18.953573 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" podUID="176ae89a-b65b-4a60-af6c-e66854cdd99f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 15:15:28.953692 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:28.953654 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" podUID="176ae89a-b65b-4a60-af6c-e66854cdd99f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 15:15:38.953976 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:38.953914 2567 prober.go:120] "Probe failed" probeType="Liveness"
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" podUID="176ae89a-b65b-4a60-af6c-e66854cdd99f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 15:15:38.954328 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:38.954003 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg"
Apr 16 15:15:38.954487 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:38.954457 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ef80a767b64d09dcb29fcfdeae99808f5baa2c19dd1b30bd248cbcee4573920a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 15:15:38.954532 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:38.954518 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" podUID="176ae89a-b65b-4a60-af6c-e66854cdd99f" containerName="service-proxy" containerID="cri-o://ef80a767b64d09dcb29fcfdeae99808f5baa2c19dd1b30bd248cbcee4573920a" gracePeriod=30
Apr 16 15:15:39.209062 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:39.208981 2567 generic.go:358] "Generic (PLEG): container finished" podID="176ae89a-b65b-4a60-af6c-e66854cdd99f" containerID="ef80a767b64d09dcb29fcfdeae99808f5baa2c19dd1b30bd248cbcee4573920a" exitCode=2
Apr 16 15:15:39.209062 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:39.209022 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" event={"ID":"176ae89a-b65b-4a60-af6c-e66854cdd99f","Type":"ContainerDied","Data":"ef80a767b64d09dcb29fcfdeae99808f5baa2c19dd1b30bd248cbcee4573920a"} Apr
16 15:15:39.209062 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:39.209048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-749c984784-rznpg" event={"ID":"176ae89a-b65b-4a60-af6c-e66854cdd99f","Type":"ContainerStarted","Data":"4173ae36d5ca2fb2a8cacfcb2dea91b6632bd58f080a88dc020d23f6288d9b11"}
Apr 16 15:15:43.325415 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:43.325376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:15:43.327628 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:43.327607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba267359-2c95-4792-991e-a2e9eae5b290-metrics-certs\") pod \"network-metrics-daemon-x8njb\" (UID: \"ba267359-2c95-4792-991e-a2e9eae5b290\") " pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:15:43.435500 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:43.435470 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5mf5n\""
Apr 16 15:15:43.443004 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:43.442977 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x8njb"
Apr 16 15:15:43.586180 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:43.586107 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x8njb"]
Apr 16 15:15:43.589178 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:15:43.589146 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba267359_2c95_4792_991e_a2e9eae5b290.slice/crio-394b2fb24460f44683aea74f62c01a975e91c00b851d40d25565d4bb1282d8a9 WatchSource:0}: Error finding container 394b2fb24460f44683aea74f62c01a975e91c00b851d40d25565d4bb1282d8a9: Status 404 returned error can't find the container with id 394b2fb24460f44683aea74f62c01a975e91c00b851d40d25565d4bb1282d8a9
Apr 16 15:15:44.223193 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:44.223150 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x8njb" event={"ID":"ba267359-2c95-4792-991e-a2e9eae5b290","Type":"ContainerStarted","Data":"394b2fb24460f44683aea74f62c01a975e91c00b851d40d25565d4bb1282d8a9"}
Apr 16 15:15:45.227664 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:45.227579 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x8njb" event={"ID":"ba267359-2c95-4792-991e-a2e9eae5b290","Type":"ContainerStarted","Data":"31d4199c8569045ac04e67d4e9f97f12cd6329962b3ca1047158c94d8cea3ed1"}
Apr 16 15:15:45.227664 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:45.227617 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x8njb" event={"ID":"ba267359-2c95-4792-991e-a2e9eae5b290","Type":"ContainerStarted","Data":"f79c02e6fdae48c74d611583fab016d83e5cbd575a297dfff8b0bf7006e479e5"}
Apr 16 15:15:45.243885 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:15:45.243591 2567 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="openshift-multus/network-metrics-daemon-x8njb" podStartSLOduration=252.926072648 podStartE2EDuration="4m14.243573669s" podCreationTimestamp="2026-04-16 15:11:31 +0000 UTC" firstStartedPulling="2026-04-16 15:15:43.591143851 +0000 UTC m=+252.620715207" lastFinishedPulling="2026-04-16 15:15:44.908644869 +0000 UTC m=+253.938216228" observedRunningTime="2026-04-16 15:15:45.243052214 +0000 UTC m=+254.272623603" watchObservedRunningTime="2026-04-16 15:15:45.243573669 +0000 UTC m=+254.273145047"
Apr 16 15:16:10.973423 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:16:10.973382 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" podUID="a7946e2b-4899-4e79-8237-f8184b28abd7"
Apr 16 15:16:10.973423 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:16:10.973382 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9vkww" podUID="aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9"
Apr 16 15:16:10.973861 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:16:10.973382 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6cpbg" podUID="21ef67e8-6503-4ba6-b6ed-bc1016b3958d"
Apr 16 15:16:11.288981 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:11.288883 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vkww"
Apr 16 15:16:11.289111 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:11.288883 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:16:11.289111 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:11.288883 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:16:14.550192 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.550163 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:16:14.550736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.550203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:16:14.550736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.550230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:16:14.552576 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.552544 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9-metrics-tls\") pod \"dns-default-9vkww\" (UID: \"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9\") " pod="openshift-dns/dns-default-9vkww"
Apr 16 15:16:14.552680 ip-10-0-135-252 kubenswrapper[2567]: I0416
15:16:14.552606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a7946e2b-4899-4e79-8237-f8184b28abd7-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tzrgf\" (UID: \"a7946e2b-4899-4e79-8237-f8184b28abd7\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:16:14.552680 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.552659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ef67e8-6503-4ba6-b6ed-bc1016b3958d-cert\") pod \"ingress-canary-6cpbg\" (UID: \"21ef67e8-6503-4ba6-b6ed-bc1016b3958d\") " pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:16:14.593521 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.593500 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7gtc5\""
Apr 16 15:16:14.593521 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.593500 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rc5hb\""
Apr 16 15:16:14.593677 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.593500 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xrkdr\""
Apr 16 15:16:14.600085 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.600067 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vkww"
Apr 16 15:16:14.600130 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.600085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6cpbg"
Apr 16 15:16:14.600215 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.600203 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"
Apr 16 15:16:14.755327 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.755184 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6cpbg"]
Apr 16 15:16:14.757494 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:16:14.757457 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ef67e8_6503_4ba6_b6ed_bc1016b3958d.slice/crio-d12d1fd7b6946445ae5f21372a1892970254c56c281f667592b0148a4f041535 WatchSource:0}: Error finding container d12d1fd7b6946445ae5f21372a1892970254c56c281f667592b0148a4f041535: Status 404 returned error can't find the container with id d12d1fd7b6946445ae5f21372a1892970254c56c281f667592b0148a4f041535
Apr 16 15:16:14.768894 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.768681 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9vkww"]
Apr 16 15:16:14.770685 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:16:14.770660 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9b0fcd_3a0a_43a8_8c23_a08827fb11b9.slice/crio-df4d8fe7ca44491906c4191e53b1f7a910b91c636e57f02a5e4d657093a8aa01 WatchSource:0}: Error finding container df4d8fe7ca44491906c4191e53b1f7a910b91c636e57f02a5e4d657093a8aa01: Status 404 returned error can't find the container with id df4d8fe7ca44491906c4191e53b1f7a910b91c636e57f02a5e4d657093a8aa01
Apr 16 15:16:14.789863 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:14.789839 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf"]
Apr 16 15:16:14.791819 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:16:14.791792 2567 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7946e2b_4899_4e79_8237_f8184b28abd7.slice/crio-f18a223cd253d33c0511d85b976c88e22a13a6cbd421219708f4e48dc05b5319 WatchSource:0}: Error finding container f18a223cd253d33c0511d85b976c88e22a13a6cbd421219708f4e48dc05b5319: Status 404 returned error can't find the container with id f18a223cd253d33c0511d85b976c88e22a13a6cbd421219708f4e48dc05b5319
Apr 16 15:16:15.301219 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:15.301184 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9vkww" event={"ID":"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9","Type":"ContainerStarted","Data":"df4d8fe7ca44491906c4191e53b1f7a910b91c636e57f02a5e4d657093a8aa01"}
Apr 16 15:16:15.302406 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:15.302367 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" event={"ID":"a7946e2b-4899-4e79-8237-f8184b28abd7","Type":"ContainerStarted","Data":"f18a223cd253d33c0511d85b976c88e22a13a6cbd421219708f4e48dc05b5319"}
Apr 16 15:16:15.303582 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:15.303559 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6cpbg" event={"ID":"21ef67e8-6503-4ba6-b6ed-bc1016b3958d","Type":"ContainerStarted","Data":"d12d1fd7b6946445ae5f21372a1892970254c56c281f667592b0148a4f041535"}
Apr 16 15:16:17.309753 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.309712 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6cpbg" event={"ID":"21ef67e8-6503-4ba6-b6ed-bc1016b3958d","Type":"ContainerStarted","Data":"da65ad9d5260116b8a47f3cbab298e014fdd7d807b7e0579fa88c7cfa06551c1"}
Apr 16 15:16:17.315687 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.315659 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9vkww"
event={"ID":"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9","Type":"ContainerStarted","Data":"ec6dc578b47a8e49aa2c70e628f45b4f6253b48d19110291b5ed68d659e20378"}
Apr 16 15:16:17.315687 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.315690 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9vkww" event={"ID":"aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9","Type":"ContainerStarted","Data":"539adff2236f92f8f1babd7942edda0555e2cce83834cfe983e4e0c62c15b7a7"}
Apr 16 15:16:17.315941 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.315860 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9vkww"
Apr 16 15:16:17.316828 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.316807 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" event={"ID":"a7946e2b-4899-4e79-8237-f8184b28abd7","Type":"ContainerStarted","Data":"2ca5a3371252e5b18b848f27d317c56474a87dae7866ae6d64b00f50b8778fb6"}
Apr 16 15:16:17.328484 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.328449 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6cpbg" podStartSLOduration=251.219794601 podStartE2EDuration="4m13.32844009s" podCreationTimestamp="2026-04-16 15:12:04 +0000 UTC" firstStartedPulling="2026-04-16 15:16:14.759244344 +0000 UTC m=+283.788815701" lastFinishedPulling="2026-04-16 15:16:16.867889829 +0000 UTC m=+285.897461190" observedRunningTime="2026-04-16 15:16:17.327053343 +0000 UTC m=+286.356624721" watchObservedRunningTime="2026-04-16 15:16:17.32844009 +0000 UTC m=+286.358011461"
Apr 16 15:16:17.345727 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.345645 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9vkww" podStartSLOduration=251.255131537 podStartE2EDuration="4m13.345631436s" podCreationTimestamp="2026-04-16 15:12:04
+0000 UTC" firstStartedPulling="2026-04-16 15:16:14.772581155 +0000 UTC m=+283.802152519" lastFinishedPulling="2026-04-16 15:16:16.863081063 +0000 UTC m=+285.892652418" observedRunningTime="2026-04-16 15:16:17.345611559 +0000 UTC m=+286.375182938" watchObservedRunningTime="2026-04-16 15:16:17.345631436 +0000 UTC m=+286.375202814"
Apr 16 15:16:17.365885 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:17.365848 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tzrgf" podStartSLOduration=262.297796147 podStartE2EDuration="4m24.365836815s" podCreationTimestamp="2026-04-16 15:11:53 +0000 UTC" firstStartedPulling="2026-04-16 15:16:14.793551251 +0000 UTC m=+283.823122608" lastFinishedPulling="2026-04-16 15:16:16.861591905 +0000 UTC m=+285.891163276" observedRunningTime="2026-04-16 15:16:17.36569806 +0000 UTC m=+286.395269438" watchObservedRunningTime="2026-04-16 15:16:17.365836815 +0000 UTC m=+286.395408191"
Apr 16 15:16:27.322539 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:27.322509 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9vkww"
Apr 16 15:16:31.415528 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:31.415493 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log"
Apr 16 15:16:31.416005 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:31.415777 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log"
Apr 16 15:16:31.418225 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:16:31.418208 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 15:19:25.043301 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.043210 2567 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["cert-manager/cert-manager-webhook-597b96b99b-jn4k6"]
Apr 16 15:19:25.046195 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.046171 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.050846 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.050821 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 15:19:25.050955 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.050821 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bqs5r\""
Apr 16 15:19:25.051028 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.050957 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 15:19:25.069739 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.069713 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jn4k6"]
Apr 16 15:19:25.224467 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.224426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0d10e1-8965-41b4-ae99-e5f37ce98da1-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jn4k6\" (UID: \"4d0d10e1-8965-41b4-ae99-e5f37ce98da1\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.224639 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.224482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jj4r\" (UniqueName: \"kubernetes.io/projected/4d0d10e1-8965-41b4-ae99-e5f37ce98da1-kube-api-access-6jj4r\") pod \"cert-manager-webhook-597b96b99b-jn4k6\" (UID: \"4d0d10e1-8965-41b4-ae99-e5f37ce98da1\") "
pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.325865 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.325836 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jj4r\" (UniqueName: \"kubernetes.io/projected/4d0d10e1-8965-41b4-ae99-e5f37ce98da1-kube-api-access-6jj4r\") pod \"cert-manager-webhook-597b96b99b-jn4k6\" (UID: \"4d0d10e1-8965-41b4-ae99-e5f37ce98da1\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.325967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.325901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0d10e1-8965-41b4-ae99-e5f37ce98da1-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jn4k6\" (UID: \"4d0d10e1-8965-41b4-ae99-e5f37ce98da1\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.344618 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.344582 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jj4r\" (UniqueName: \"kubernetes.io/projected/4d0d10e1-8965-41b4-ae99-e5f37ce98da1-kube-api-access-6jj4r\") pod \"cert-manager-webhook-597b96b99b-jn4k6\" (UID: \"4d0d10e1-8965-41b4-ae99-e5f37ce98da1\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.344979 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.344963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d0d10e1-8965-41b4-ae99-e5f37ce98da1-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jn4k6\" (UID: \"4d0d10e1-8965-41b4-ae99-e5f37ce98da1\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6"
Apr 16 15:19:25.354881 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.354864 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6" Apr 16 15:19:25.493379 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.493268 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jn4k6"] Apr 16 15:19:25.495865 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:19:25.495836 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d0d10e1_8965_41b4_ae99_e5f37ce98da1.slice/crio-40b8e4b03ade31980a828ddc94a333e6654b75d5d1b668eee2716a14aa0e8e9f WatchSource:0}: Error finding container 40b8e4b03ade31980a828ddc94a333e6654b75d5d1b668eee2716a14aa0e8e9f: Status 404 returned error can't find the container with id 40b8e4b03ade31980a828ddc94a333e6654b75d5d1b668eee2716a14aa0e8e9f Apr 16 15:19:25.497669 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.497652 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:19:25.776124 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:25.776041 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6" event={"ID":"4d0d10e1-8965-41b4-ae99-e5f37ce98da1","Type":"ContainerStarted","Data":"40b8e4b03ade31980a828ddc94a333e6654b75d5d1b668eee2716a14aa0e8e9f"} Apr 16 15:19:28.788084 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:28.788048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6" event={"ID":"4d0d10e1-8965-41b4-ae99-e5f37ce98da1","Type":"ContainerStarted","Data":"1f18a888775c2117b26423a25b152e505329d1d60fea613e31e9918aae53674f"} Apr 16 15:19:28.788438 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:28.788143 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6" Apr 16 15:19:28.808590 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:19:28.808532 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6" podStartSLOduration=0.699995073 podStartE2EDuration="3.808521245s" podCreationTimestamp="2026-04-16 15:19:25 +0000 UTC" firstStartedPulling="2026-04-16 15:19:25.497865444 +0000 UTC m=+474.527436815" lastFinishedPulling="2026-04-16 15:19:28.606391618 +0000 UTC m=+477.635962987" observedRunningTime="2026-04-16 15:19:28.806951097 +0000 UTC m=+477.836522468" watchObservedRunningTime="2026-04-16 15:19:28.808521245 +0000 UTC m=+477.838092623" Apr 16 15:19:34.793331 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:34.793299 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jn4k6" Apr 16 15:19:40.573894 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.573861 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-h8ssp"] Apr 16 15:19:40.577027 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.577011 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.579772 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.579750 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-fnv2c\"" Apr 16 15:19:40.588658 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.588638 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-h8ssp"] Apr 16 15:19:40.633870 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.633843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fddz5\" (UniqueName: \"kubernetes.io/projected/79bb8741-4be9-4812-b649-0b3141ee5bbc-kube-api-access-fddz5\") pod \"cert-manager-759f64656b-h8ssp\" (UID: \"79bb8741-4be9-4812-b649-0b3141ee5bbc\") " pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.634017 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.633879 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79bb8741-4be9-4812-b649-0b3141ee5bbc-bound-sa-token\") pod \"cert-manager-759f64656b-h8ssp\" (UID: \"79bb8741-4be9-4812-b649-0b3141ee5bbc\") " pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.734406 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.734374 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fddz5\" (UniqueName: \"kubernetes.io/projected/79bb8741-4be9-4812-b649-0b3141ee5bbc-kube-api-access-fddz5\") pod \"cert-manager-759f64656b-h8ssp\" (UID: \"79bb8741-4be9-4812-b649-0b3141ee5bbc\") " pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.734406 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.734413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/79bb8741-4be9-4812-b649-0b3141ee5bbc-bound-sa-token\") pod \"cert-manager-759f64656b-h8ssp\" (UID: \"79bb8741-4be9-4812-b649-0b3141ee5bbc\") " pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.747334 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.747308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fddz5\" (UniqueName: \"kubernetes.io/projected/79bb8741-4be9-4812-b649-0b3141ee5bbc-kube-api-access-fddz5\") pod \"cert-manager-759f64656b-h8ssp\" (UID: \"79bb8741-4be9-4812-b649-0b3141ee5bbc\") " pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.767278 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.767247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79bb8741-4be9-4812-b649-0b3141ee5bbc-bound-sa-token\") pod \"cert-manager-759f64656b-h8ssp\" (UID: \"79bb8741-4be9-4812-b649-0b3141ee5bbc\") " pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:40.885493 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:40.885403 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-h8ssp" Apr 16 15:19:41.003239 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:41.003206 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-h8ssp"] Apr 16 15:19:41.006479 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:19:41.006452 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79bb8741_4be9_4812_b649_0b3141ee5bbc.slice/crio-05b26ab69bbe6bfc4e2fb0fa5fe634f336793c295c0f1719e94bbf44e1446691 WatchSource:0}: Error finding container 05b26ab69bbe6bfc4e2fb0fa5fe634f336793c295c0f1719e94bbf44e1446691: Status 404 returned error can't find the container with id 05b26ab69bbe6bfc4e2fb0fa5fe634f336793c295c0f1719e94bbf44e1446691 Apr 16 15:19:41.822982 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:41.822948 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-h8ssp" event={"ID":"79bb8741-4be9-4812-b649-0b3141ee5bbc","Type":"ContainerStarted","Data":"8c33dcd036c2128b5fef9ac0e36b736182396fa9856b83fe6456e4da4d836fbf"} Apr 16 15:19:41.822982 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:41.822983 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-h8ssp" event={"ID":"79bb8741-4be9-4812-b649-0b3141ee5bbc","Type":"ContainerStarted","Data":"05b26ab69bbe6bfc4e2fb0fa5fe634f336793c295c0f1719e94bbf44e1446691"} Apr 16 15:19:41.842110 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:19:41.842069 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-h8ssp" podStartSLOduration=1.842057654 podStartE2EDuration="1.842057654s" podCreationTimestamp="2026-04-16 15:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:19:41.840626274 +0000 UTC 
m=+490.870197652" watchObservedRunningTime="2026-04-16 15:19:41.842057654 +0000 UTC m=+490.871629032" Apr 16 15:20:03.184234 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.184193 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2"] Apr 16 15:20:03.191265 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.191244 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.194446 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.194425 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 15:20:03.194446 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.194435 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 15:20:03.194617 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.194428 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 15:20:03.194617 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.194432 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 15:20:03.195288 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.195271 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-tfg64\"" Apr 16 15:20:03.277080 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.277048 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg42q\" (UniqueName: \"kubernetes.io/projected/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-kube-api-access-fg42q\") pod 
\"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.277239 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.277087 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.277239 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.277171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.363132 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.363094 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2"] Apr 16 15:20:03.378070 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.378037 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.378223 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.378098 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fg42q\" (UniqueName: \"kubernetes.io/projected/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-kube-api-access-fg42q\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.378223 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.378133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.380522 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.380500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.380606 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.380541 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.389140 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.389117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg42q\" (UniqueName: \"kubernetes.io/projected/bff12472-42d3-46f3-8d9e-a83c2d3ed06c-kube-api-access-fg42q\") pod 
\"opendatahub-operator-controller-manager-68df4b58f7-ljsh2\" (UID: \"bff12472-42d3-46f3-8d9e-a83c2d3ed06c\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.500800 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.500694 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:03.653221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.653189 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2"] Apr 16 15:20:03.656612 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:20:03.656586 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff12472_42d3_46f3_8d9e_a83c2d3ed06c.slice/crio-ebc05a90c926d8eb5c1871b9d2355ad527ba1b540186f283866aa3de70d10be3 WatchSource:0}: Error finding container ebc05a90c926d8eb5c1871b9d2355ad527ba1b540186f283866aa3de70d10be3: Status 404 returned error can't find the container with id ebc05a90c926d8eb5c1871b9d2355ad527ba1b540186f283866aa3de70d10be3 Apr 16 15:20:03.884143 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:03.884104 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" event={"ID":"bff12472-42d3-46f3-8d9e-a83c2d3ed06c","Type":"ContainerStarted","Data":"ebc05a90c926d8eb5c1871b9d2355ad527ba1b540186f283866aa3de70d10be3"} Apr 16 15:20:06.894890 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:06.894856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" event={"ID":"bff12472-42d3-46f3-8d9e-a83c2d3ed06c","Type":"ContainerStarted","Data":"f4f4a88893a0e0ae4c0c8c25bbb6637e843109cf6f2c4cc711dfded7abbcf294"} Apr 16 15:20:06.895274 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:06.895001 
2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:06.920070 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:06.920016 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" podStartSLOduration=1.5132384989999998 podStartE2EDuration="3.92000157s" podCreationTimestamp="2026-04-16 15:20:03 +0000 UTC" firstStartedPulling="2026-04-16 15:20:03.658173065 +0000 UTC m=+512.687744427" lastFinishedPulling="2026-04-16 15:20:06.064936125 +0000 UTC m=+515.094507498" observedRunningTime="2026-04-16 15:20:06.918669871 +0000 UTC m=+515.948241246" watchObservedRunningTime="2026-04-16 15:20:06.92000157 +0000 UTC m=+515.949572947" Apr 16 15:20:17.902324 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:17.902296 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-ljsh2" Apr 16 15:20:24.819745 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.819709 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k"] Apr 16 15:20:24.822620 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.822603 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:24.826996 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.826973 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 15:20:24.827589 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.827574 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 15:20:24.827651 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.827591 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 15:20:24.828376 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.828355 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-p8mts\"" Apr 16 15:20:24.828485 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.828377 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 15:20:24.836161 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.836132 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k"] Apr 16 15:20:24.937261 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.937175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtfzf\" (UniqueName: \"kubernetes.io/projected/e663dcf5-78f2-490b-94b7-cedf08e5a957-kube-api-access-mtfzf\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:24.937261 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.937238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e663dcf5-78f2-490b-94b7-cedf08e5a957-tls-certs\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:24.937444 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:24.937284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e663dcf5-78f2-490b-94b7-cedf08e5a957-tmp\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.038308 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.038265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtfzf\" (UniqueName: \"kubernetes.io/projected/e663dcf5-78f2-490b-94b7-cedf08e5a957-kube-api-access-mtfzf\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.038308 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.038317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e663dcf5-78f2-490b-94b7-cedf08e5a957-tls-certs\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.038556 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.038344 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e663dcf5-78f2-490b-94b7-cedf08e5a957-tmp\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.040764 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.040734 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e663dcf5-78f2-490b-94b7-cedf08e5a957-tmp\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.040993 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.040972 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e663dcf5-78f2-490b-94b7-cedf08e5a957-tls-certs\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.048331 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.048311 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtfzf\" (UniqueName: \"kubernetes.io/projected/e663dcf5-78f2-490b-94b7-cedf08e5a957-kube-api-access-mtfzf\") pod \"kube-auth-proxy-5bb547c98c-jlk6k\" (UID: \"e663dcf5-78f2-490b-94b7-cedf08e5a957\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.132250 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.132215 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" Apr 16 15:20:25.247508 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.247475 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k"] Apr 16 15:20:25.251854 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:20:25.251823 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode663dcf5_78f2_490b_94b7_cedf08e5a957.slice/crio-82d92aa04aeffffcd146d0b78804fd4dcd26e304ca3820680476824c9ec16362 WatchSource:0}: Error finding container 82d92aa04aeffffcd146d0b78804fd4dcd26e304ca3820680476824c9ec16362: Status 404 returned error can't find the container with id 82d92aa04aeffffcd146d0b78804fd4dcd26e304ca3820680476824c9ec16362 Apr 16 15:20:25.947482 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:25.947446 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" event={"ID":"e663dcf5-78f2-490b-94b7-cedf08e5a957","Type":"ContainerStarted","Data":"82d92aa04aeffffcd146d0b78804fd4dcd26e304ca3820680476824c9ec16362"} Apr 16 15:20:26.889907 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.889873 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jslnj"] Apr 16 15:20:26.892842 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.892819 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:26.895848 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.895824 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-k88jk\"" Apr 16 15:20:26.896084 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.896056 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 15:20:26.902971 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.902908 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jslnj"] Apr 16 15:20:26.954762 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.954726 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k75mh\" (UniqueName: \"kubernetes.io/projected/2d4c80b4-e69c-47d7-b164-3260950a2215-kube-api-access-k75mh\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:26.955222 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:26.954798 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4c80b4-e69c-47d7-b164-3260950a2215-cert\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:27.055299 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:27.055254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k75mh\" (UniqueName: \"kubernetes.io/projected/2d4c80b4-e69c-47d7-b164-3260950a2215-kube-api-access-k75mh\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:27.055487 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:27.055336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4c80b4-e69c-47d7-b164-3260950a2215-cert\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:27.055568 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:27.055497 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 15:20:27.055568 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:27.055568 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4c80b4-e69c-47d7-b164-3260950a2215-cert podName:2d4c80b4-e69c-47d7-b164-3260950a2215 nodeName:}" failed. No retries permitted until 2026-04-16 15:20:27.555548951 +0000 UTC m=+536.585120321 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d4c80b4-e69c-47d7-b164-3260950a2215-cert") pod "odh-model-controller-858dbf95b8-jslnj" (UID: "2d4c80b4-e69c-47d7-b164-3260950a2215") : secret "odh-model-controller-webhook-cert" not found Apr 16 15:20:27.064695 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:27.064655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k75mh\" (UniqueName: \"kubernetes.io/projected/2d4c80b4-e69c-47d7-b164-3260950a2215-kube-api-access-k75mh\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:27.560962 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:27.560810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4c80b4-e69c-47d7-b164-3260950a2215-cert\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:27.563807 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:27.563752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4c80b4-e69c-47d7-b164-3260950a2215-cert\") pod \"odh-model-controller-858dbf95b8-jslnj\" (UID: \"2d4c80b4-e69c-47d7-b164-3260950a2215\") " pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:27.806019 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:27.805981 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:28.215017 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:28.214993 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jslnj"] Apr 16 15:20:28.217855 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:20:28.217822 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4c80b4_e69c_47d7_b164_3260950a2215.slice/crio-9a3992fa8c801922ae2ef39877c19f066533baf886177709978ceefab2773efe WatchSource:0}: Error finding container 9a3992fa8c801922ae2ef39877c19f066533baf886177709978ceefab2773efe: Status 404 returned error can't find the container with id 9a3992fa8c801922ae2ef39877c19f066533baf886177709978ceefab2773efe Apr 16 15:20:28.958512 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:28.958478 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" event={"ID":"2d4c80b4-e69c-47d7-b164-3260950a2215","Type":"ContainerStarted","Data":"9a3992fa8c801922ae2ef39877c19f066533baf886177709978ceefab2773efe"} Apr 16 15:20:29.963728 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:29.963676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" event={"ID":"e663dcf5-78f2-490b-94b7-cedf08e5a957","Type":"ContainerStarted","Data":"0f1e6525c06033a60548bf6a9a584d96dcaed245bfa583b785ab8f98dfb193b9"} Apr 16 15:20:29.981345 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:29.981270 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-jlk6k" podStartSLOduration=2.264909413 podStartE2EDuration="5.981251912s" podCreationTimestamp="2026-04-16 15:20:24 +0000 UTC" firstStartedPulling="2026-04-16 15:20:25.253831629 +0000 UTC m=+534.283402992" lastFinishedPulling="2026-04-16 15:20:28.970174136 +0000 UTC 
m=+537.999745491" observedRunningTime="2026-04-16 15:20:29.981037049 +0000 UTC m=+539.010608426" watchObservedRunningTime="2026-04-16 15:20:29.981251912 +0000 UTC m=+539.010823291" Apr 16 15:20:31.973078 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:31.973045 2567 generic.go:358] "Generic (PLEG): container finished" podID="2d4c80b4-e69c-47d7-b164-3260950a2215" containerID="17ce2346cb5430deb9130dd4e4dceb78c1625c1d68fd769da03615654ace787a" exitCode=1 Apr 16 15:20:31.973437 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:31.973135 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" event={"ID":"2d4c80b4-e69c-47d7-b164-3260950a2215","Type":"ContainerDied","Data":"17ce2346cb5430deb9130dd4e4dceb78c1625c1d68fd769da03615654ace787a"} Apr 16 15:20:31.973437 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:31.973312 2567 scope.go:117] "RemoveContainer" containerID="17ce2346cb5430deb9130dd4e4dceb78c1625c1d68fd769da03615654ace787a" Apr 16 15:20:32.977084 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:32.977051 2567 generic.go:358] "Generic (PLEG): container finished" podID="2d4c80b4-e69c-47d7-b164-3260950a2215" containerID="1ee8f44334821d5bc98879f2e6f2e0d759de19e98d2737e3a5637da7bda6425a" exitCode=1 Apr 16 15:20:32.977464 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:32.977090 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" event={"ID":"2d4c80b4-e69c-47d7-b164-3260950a2215","Type":"ContainerDied","Data":"1ee8f44334821d5bc98879f2e6f2e0d759de19e98d2737e3a5637da7bda6425a"} Apr 16 15:20:32.977464 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:32.977122 2567 scope.go:117] "RemoveContainer" containerID="17ce2346cb5430deb9130dd4e4dceb78c1625c1d68fd769da03615654ace787a" Apr 16 15:20:32.977464 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:32.977354 2567 scope.go:117] "RemoveContainer" 
containerID="1ee8f44334821d5bc98879f2e6f2e0d759de19e98d2737e3a5637da7bda6425a" Apr 16 15:20:32.977620 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:32.977521 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jslnj_opendatahub(2d4c80b4-e69c-47d7-b164-3260950a2215)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" podUID="2d4c80b4-e69c-47d7-b164-3260950a2215" Apr 16 15:20:33.410386 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.410348 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-fth9h"] Apr 16 15:20:33.414542 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.414525 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:33.417854 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.417827 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 15:20:33.417984 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.417858 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-z2xll\"" Apr 16 15:20:33.438223 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.438198 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-fth9h"] Apr 16 15:20:33.508357 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.508328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-cert\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:33.508488 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.508433 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbcm\" (UniqueName: \"kubernetes.io/projected/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-kube-api-access-pwbcm\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:33.608895 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.608864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbcm\" (UniqueName: \"kubernetes.io/projected/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-kube-api-access-pwbcm\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:33.608895 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.608899 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-cert\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:33.609222 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:33.609204 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 15:20:33.609293 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:33.609256 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-cert podName:c40e33f2-82ed-4240-911d-e07cd3c6f7ff nodeName:}" failed. No retries permitted until 2026-04-16 15:20:34.109242057 +0000 UTC m=+543.138813413 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-cert") pod "kserve-controller-manager-856948b99f-fth9h" (UID: "c40e33f2-82ed-4240-911d-e07cd3c6f7ff") : secret "kserve-webhook-server-cert" not found Apr 16 15:20:33.621782 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.621753 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbcm\" (UniqueName: \"kubernetes.io/projected/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-kube-api-access-pwbcm\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:33.981877 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:33.981848 2567 scope.go:117] "RemoveContainer" containerID="1ee8f44334821d5bc98879f2e6f2e0d759de19e98d2737e3a5637da7bda6425a" Apr 16 15:20:33.982266 ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:33.982031 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jslnj_opendatahub(2d4c80b4-e69c-47d7-b164-3260950a2215)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" podUID="2d4c80b4-e69c-47d7-b164-3260950a2215" Apr 16 15:20:34.113618 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:34.113586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-cert\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:34.115969 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:34.115949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c40e33f2-82ed-4240-911d-e07cd3c6f7ff-cert\") pod \"kserve-controller-manager-856948b99f-fth9h\" (UID: \"c40e33f2-82ed-4240-911d-e07cd3c6f7ff\") " pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:34.326258 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:34.326221 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:34.440460 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:34.440430 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-fth9h"] Apr 16 15:20:34.442834 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:20:34.442805 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc40e33f2_82ed_4240_911d_e07cd3c6f7ff.slice/crio-fb9a2987b2b148d753132d839764906a4089d3e4cb736b2d601f2d8f5e372ac5 WatchSource:0}: Error finding container fb9a2987b2b148d753132d839764906a4089d3e4cb736b2d601f2d8f5e372ac5: Status 404 returned error can't find the container with id fb9a2987b2b148d753132d839764906a4089d3e4cb736b2d601f2d8f5e372ac5 Apr 16 15:20:34.985571 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:34.985536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" event={"ID":"c40e33f2-82ed-4240-911d-e07cd3c6f7ff","Type":"ContainerStarted","Data":"fb9a2987b2b148d753132d839764906a4089d3e4cb736b2d601f2d8f5e372ac5"} Apr 16 15:20:37.806990 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:37.806956 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:37.807369 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:37.807356 2567 scope.go:117] "RemoveContainer" containerID="1ee8f44334821d5bc98879f2e6f2e0d759de19e98d2737e3a5637da7bda6425a" Apr 16 15:20:37.807527 
ip-10-0-135-252 kubenswrapper[2567]: E0416 15:20:37.807510 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jslnj_opendatahub(2d4c80b4-e69c-47d7-b164-3260950a2215)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" podUID="2d4c80b4-e69c-47d7-b164-3260950a2215" Apr 16 15:20:39.659982 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.659946 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg"] Apr 16 15:20:39.663332 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.663305 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.666089 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.666065 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 15:20:39.666220 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.666172 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-nv8pc\"" Apr 16 15:20:39.666753 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.666735 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 15:20:39.677665 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.677644 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg"] Apr 16 15:20:39.757579 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.757544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8scs6\" (UniqueName: 
\"kubernetes.io/projected/434018c9-3bb3-4e95-861e-4b50c83cb345-kube-api-access-8scs6\") pod \"servicemesh-operator3-55f49c5f94-cnsfg\" (UID: \"434018c9-3bb3-4e95-861e-4b50c83cb345\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.757730 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.757589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/434018c9-3bb3-4e95-861e-4b50c83cb345-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cnsfg\" (UID: \"434018c9-3bb3-4e95-861e-4b50c83cb345\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.858909 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.858868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/434018c9-3bb3-4e95-861e-4b50c83cb345-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cnsfg\" (UID: \"434018c9-3bb3-4e95-861e-4b50c83cb345\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.859098 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.858976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8scs6\" (UniqueName: \"kubernetes.io/projected/434018c9-3bb3-4e95-861e-4b50c83cb345-kube-api-access-8scs6\") pod \"servicemesh-operator3-55f49c5f94-cnsfg\" (UID: \"434018c9-3bb3-4e95-861e-4b50c83cb345\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.861981 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.861956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/434018c9-3bb3-4e95-861e-4b50c83cb345-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cnsfg\" (UID: \"434018c9-3bb3-4e95-861e-4b50c83cb345\") " 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.871894 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.871863 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8scs6\" (UniqueName: \"kubernetes.io/projected/434018c9-3bb3-4e95-861e-4b50c83cb345-kube-api-access-8scs6\") pod \"servicemesh-operator3-55f49c5f94-cnsfg\" (UID: \"434018c9-3bb3-4e95-861e-4b50c83cb345\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:39.974290 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:39.974208 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:40.120239 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:40.120207 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg"] Apr 16 15:20:40.122956 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:20:40.122910 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434018c9_3bb3_4e95_861e_4b50c83cb345.slice/crio-fabd66043c61dea9ee7029dcbb1245243f7f7fbeac02115be90e631cd9aef59d WatchSource:0}: Error finding container fabd66043c61dea9ee7029dcbb1245243f7f7fbeac02115be90e631cd9aef59d: Status 404 returned error can't find the container with id fabd66043c61dea9ee7029dcbb1245243f7f7fbeac02115be90e631cd9aef59d Apr 16 15:20:41.003036 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:41.002999 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" event={"ID":"434018c9-3bb3-4e95-861e-4b50c83cb345","Type":"ContainerStarted","Data":"fabd66043c61dea9ee7029dcbb1245243f7f7fbeac02115be90e631cd9aef59d"} Apr 16 15:20:42.008614 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:42.008572 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" event={"ID":"c40e33f2-82ed-4240-911d-e07cd3c6f7ff","Type":"ContainerStarted","Data":"95aa1bca830e69a0ae06377df9b4210f27cffa29de9d9301b2a73be28d428e1e"} Apr 16 15:20:42.009091 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:42.008757 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" Apr 16 15:20:42.028002 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:42.027946 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-fth9h" podStartSLOduration=2.2648160969999998 podStartE2EDuration="9.027917043s" podCreationTimestamp="2026-04-16 15:20:33 +0000 UTC" firstStartedPulling="2026-04-16 15:20:34.444196127 +0000 UTC m=+543.473767486" lastFinishedPulling="2026-04-16 15:20:41.207297061 +0000 UTC m=+550.236868432" observedRunningTime="2026-04-16 15:20:42.026891473 +0000 UTC m=+551.056462862" watchObservedRunningTime="2026-04-16 15:20:42.027917043 +0000 UTC m=+551.057488424" Apr 16 15:20:44.019872 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:44.019840 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" event={"ID":"434018c9-3bb3-4e95-861e-4b50c83cb345","Type":"ContainerStarted","Data":"5ac7d8155916bbd002758b11faa76981aac3f12b967a1d2d31a73b58fdfbbf48"} Apr 16 15:20:44.020250 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:44.019954 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" Apr 16 15:20:44.040963 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:44.040904 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg" podStartSLOduration=2.128355189 podStartE2EDuration="5.040892018s" podCreationTimestamp="2026-04-16 
15:20:39 +0000 UTC" firstStartedPulling="2026-04-16 15:20:40.125482749 +0000 UTC m=+549.155054108" lastFinishedPulling="2026-04-16 15:20:43.038019577 +0000 UTC m=+552.067590937" observedRunningTime="2026-04-16 15:20:44.039325564 +0000 UTC m=+553.068896942" watchObservedRunningTime="2026-04-16 15:20:44.040892018 +0000 UTC m=+553.070463395" Apr 16 15:20:47.806752 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:47.806718 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:47.807237 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:47.807094 2567 scope.go:117] "RemoveContainer" containerID="1ee8f44334821d5bc98879f2e6f2e0d759de19e98d2737e3a5637da7bda6425a" Apr 16 15:20:49.036575 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:49.036539 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" event={"ID":"2d4c80b4-e69c-47d7-b164-3260950a2215","Type":"ContainerStarted","Data":"d7b97833a1c0b3ed341af7e2f5a0e01f3949b0c45a3a80db40081ceb7aa15a6d"} Apr 16 15:20:49.037055 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:49.036848 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" Apr 16 15:20:49.056016 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:49.055972 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj" podStartSLOduration=3.216342317 podStartE2EDuration="23.055959706s" podCreationTimestamp="2026-04-16 15:20:26 +0000 UTC" firstStartedPulling="2026-04-16 15:20:28.219634064 +0000 UTC m=+537.249205421" lastFinishedPulling="2026-04-16 15:20:48.05925145 +0000 UTC m=+557.088822810" observedRunningTime="2026-04-16 15:20:49.055369336 +0000 UTC m=+558.084940738" watchObservedRunningTime="2026-04-16 15:20:49.055959706 +0000 UTC m=+558.085531083" Apr 16 15:20:54.550749 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.550715 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg"] Apr 16 15:20:54.553880 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.553856 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.557548 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.557527 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 15:20:54.557972 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.557956 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 15:20:54.558306 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.558292 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 15:20:54.560209 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.560193 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-vj7d2\"" Apr 16 15:20:54.560448 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.560426 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 15:20:54.577544 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.577522 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg"] Apr 16 15:20:54.669562 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lbx6\" (UniqueName: \"kubernetes.io/projected/62e19e13-4da7-4de9-b6d5-caf54792d3fc-kube-api-access-5lbx6\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" 
(UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.669740 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669584 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.669740 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669611 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.669740 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.669855 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669754 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/62e19e13-4da7-4de9-b6d5-caf54792d3fc-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.669855 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.669855 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.669818 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772060 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/62e19e13-4da7-4de9-b6d5-caf54792d3fc-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lbx6\" (UniqueName: \"kubernetes.io/projected/62e19e13-4da7-4de9-b6d5-caf54792d3fc-kube-api-access-5lbx6\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.772248 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.773269 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.772838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.776191 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.776160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/62e19e13-4da7-4de9-b6d5-caf54792d3fc-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.776308 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.776190 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.776392 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.776369 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.776548 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.776531 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.781238 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.781218 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lbx6\" (UniqueName: \"kubernetes.io/projected/62e19e13-4da7-4de9-b6d5-caf54792d3fc-kube-api-access-5lbx6\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.781502 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.781482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/62e19e13-4da7-4de9-b6d5-caf54792d3fc-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4rtzg\" (UID: \"62e19e13-4da7-4de9-b6d5-caf54792d3fc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" Apr 16 15:20:54.866321 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.866288 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg"
Apr 16 15:20:54.998841 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:54.998812 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg"]
Apr 16 15:20:55.001453 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:20:55.001428 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e19e13_4da7_4de9_b6d5_caf54792d3fc.slice/crio-0226b3cc9ace22678bf53e4b6e07301cf382abe37f5f36e0314a6e859822316d WatchSource:0}: Error finding container 0226b3cc9ace22678bf53e4b6e07301cf382abe37f5f36e0314a6e859822316d: Status 404 returned error can't find the container with id 0226b3cc9ace22678bf53e4b6e07301cf382abe37f5f36e0314a6e859822316d
Apr 16 15:20:55.024505 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:55.024483 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cnsfg"
Apr 16 15:20:55.058502 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:55.058466 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" event={"ID":"62e19e13-4da7-4de9-b6d5-caf54792d3fc","Type":"ContainerStarted","Data":"0226b3cc9ace22678bf53e4b6e07301cf382abe37f5f36e0314a6e859822316d"}
Apr 16 15:20:57.519770 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:57.519733 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 15:20:57.520196 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:57.519795 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 15:20:58.071256 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:58.071224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" event={"ID":"62e19e13-4da7-4de9-b6d5-caf54792d3fc","Type":"ContainerStarted","Data":"8c7ad8a37040d7e071f2da2002b40657ad4a126811455d010aa6f1f061f6a207"}
Apr 16 15:20:58.071484 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:58.071455 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg"
Apr 16 15:20:58.072794 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:58.072766 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-4rtzg container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 16 15:20:58.072896 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:58.072821 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" podUID="62e19e13-4da7-4de9-b6d5-caf54792d3fc" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:20:58.094899 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:58.094853 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg" podStartSLOduration=1.579305448 podStartE2EDuration="4.09484097s" podCreationTimestamp="2026-04-16 15:20:54 +0000 UTC" firstStartedPulling="2026-04-16 15:20:55.003970936 +0000 UTC m=+564.033542298" lastFinishedPulling="2026-04-16 15:20:57.519506461 +0000 UTC m=+566.549077820" observedRunningTime="2026-04-16 15:20:58.094058513 +0000 UTC m=+567.123629893" watchObservedRunningTime="2026-04-16 15:20:58.09484097 +0000 UTC m=+567.124412348"
Apr 16 15:20:59.075937 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:20:59.075907 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4rtzg"
Apr 16 15:21:00.042690 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:21:00.042656 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-jslnj"
Apr 16 15:21:13.020299 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:21:13.020267 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-fth9h"
Apr 16 15:21:31.436971 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:21:31.436945 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log"
Apr 16 15:21:31.437415 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:21:31.436946 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log"
Apr 16 15:22:02.620853 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.620768 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-2j82l"]
Apr 16 15:22:02.623178 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.623157 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:22:02.627444 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.627420 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-mt2tc\""
Apr 16 15:22:02.627572 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.627479 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 15:22:02.628474 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.628459 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 15:22:02.644605 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.644576 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-2j82l"]
Apr 16 15:22:02.805411 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.805374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zscv9\" (UniqueName: \"kubernetes.io/projected/80d2cb48-658d-409a-bf79-90e5fc42ee50-kube-api-access-zscv9\") pod \"authorino-operator-657f44b778-2j82l\" (UID: \"80d2cb48-658d-409a-bf79-90e5fc42ee50\") " pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:22:02.906036 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.905941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zscv9\" (UniqueName: \"kubernetes.io/projected/80d2cb48-658d-409a-bf79-90e5fc42ee50-kube-api-access-zscv9\") pod \"authorino-operator-657f44b778-2j82l\" (UID: \"80d2cb48-658d-409a-bf79-90e5fc42ee50\") " pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:22:02.914898 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.914863 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscv9\" (UniqueName: \"kubernetes.io/projected/80d2cb48-658d-409a-bf79-90e5fc42ee50-kube-api-access-zscv9\") pod \"authorino-operator-657f44b778-2j82l\" (UID: \"80d2cb48-658d-409a-bf79-90e5fc42ee50\") " pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:22:02.932693 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:02.932664 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:22:03.054438 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:03.054407 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-2j82l"]
Apr 16 15:22:03.057787 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:22:03.057759 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d2cb48_658d_409a_bf79_90e5fc42ee50.slice/crio-20b7bef276b058dffe6bc18a3cf3bc1ad4359e65ef14d64c55e7204c76aa901b WatchSource:0}: Error finding container 20b7bef276b058dffe6bc18a3cf3bc1ad4359e65ef14d64c55e7204c76aa901b: Status 404 returned error can't find the container with id 20b7bef276b058dffe6bc18a3cf3bc1ad4359e65ef14d64c55e7204c76aa901b
Apr 16 15:22:03.272604 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:03.272530 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-2j82l" event={"ID":"80d2cb48-658d-409a-bf79-90e5fc42ee50","Type":"ContainerStarted","Data":"20b7bef276b058dffe6bc18a3cf3bc1ad4359e65ef14d64c55e7204c76aa901b"}
Apr 16 15:22:05.281201 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:05.281160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-2j82l" event={"ID":"80d2cb48-658d-409a-bf79-90e5fc42ee50","Type":"ContainerStarted","Data":"52f25178626d8469f6deb1b5302ea445d92a38fb86012cc5c884bf3aec7a2f0f"}
Apr 16 15:22:05.281607 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:05.281294 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:22:05.307769 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:05.307722 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-2j82l" podStartSLOduration=1.73422794 podStartE2EDuration="3.307708473s" podCreationTimestamp="2026-04-16 15:22:02 +0000 UTC" firstStartedPulling="2026-04-16 15:22:03.060096251 +0000 UTC m=+632.089667612" lastFinishedPulling="2026-04-16 15:22:04.633576789 +0000 UTC m=+633.663148145" observedRunningTime="2026-04-16 15:22:05.305487042 +0000 UTC m=+634.335058420" watchObservedRunningTime="2026-04-16 15:22:05.307708473 +0000 UTC m=+634.337279850"
Apr 16 15:22:16.287281 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:22:16.287251 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-2j82l"
Apr 16 15:23:01.985571 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:01.985540 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:23:01.987824 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:01.987800 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:01.990521 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:01.990502 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-mjgck\""
Apr 16 15:23:01.990617 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:01.990538 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 15:23:01.996397 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:01.996373 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:23:02.010282 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.010251 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:23:02.137412 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.137378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf6cq\" (UniqueName: \"kubernetes.io/projected/b3287a5b-39e8-4940-b76e-825d9517dcd1-kube-api-access-tf6cq\") pod \"limitador-limitador-78c99df468-56m2g\" (UID: \"b3287a5b-39e8-4940-b76e-825d9517dcd1\") " pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.137581 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.137427 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b3287a5b-39e8-4940-b76e-825d9517dcd1-config-file\") pod \"limitador-limitador-78c99df468-56m2g\" (UID: \"b3287a5b-39e8-4940-b76e-825d9517dcd1\") " pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.237978 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.237902 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b3287a5b-39e8-4940-b76e-825d9517dcd1-config-file\") pod \"limitador-limitador-78c99df468-56m2g\" (UID: \"b3287a5b-39e8-4940-b76e-825d9517dcd1\") " pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.238114 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.238013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf6cq\" (UniqueName: \"kubernetes.io/projected/b3287a5b-39e8-4940-b76e-825d9517dcd1-kube-api-access-tf6cq\") pod \"limitador-limitador-78c99df468-56m2g\" (UID: \"b3287a5b-39e8-4940-b76e-825d9517dcd1\") " pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.238583 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.238564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b3287a5b-39e8-4940-b76e-825d9517dcd1-config-file\") pod \"limitador-limitador-78c99df468-56m2g\" (UID: \"b3287a5b-39e8-4940-b76e-825d9517dcd1\") " pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.245861 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.245837 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf6cq\" (UniqueName: \"kubernetes.io/projected/b3287a5b-39e8-4940-b76e-825d9517dcd1-kube-api-access-tf6cq\") pod \"limitador-limitador-78c99df468-56m2g\" (UID: \"b3287a5b-39e8-4940-b76e-825d9517dcd1\") " pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.297518 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.297483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:02.424717 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.424688 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:23:02.427356 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:23:02.427329 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3287a5b_39e8_4940_b76e_825d9517dcd1.slice/crio-2a5b8d77887dc55f0b6d638d26c6e2109b4ad7e77b8d71fc267b60ba74b012b0 WatchSource:0}: Error finding container 2a5b8d77887dc55f0b6d638d26c6e2109b4ad7e77b8d71fc267b60ba74b012b0: Status 404 returned error can't find the container with id 2a5b8d77887dc55f0b6d638d26c6e2109b4ad7e77b8d71fc267b60ba74b012b0
Apr 16 15:23:02.461414 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:02.461380 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g" event={"ID":"b3287a5b-39e8-4940-b76e-825d9517dcd1","Type":"ContainerStarted","Data":"2a5b8d77887dc55f0b6d638d26c6e2109b4ad7e77b8d71fc267b60ba74b012b0"}
Apr 16 15:23:05.477166 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:05.477127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g" event={"ID":"b3287a5b-39e8-4940-b76e-825d9517dcd1","Type":"ContainerStarted","Data":"dacd632ee12bf31de22280b120d90f978b20171b9df349d40ac5971c698f08dc"}
Apr 16 15:23:05.477499 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:05.477195 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:05.495388 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:05.495336 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g" podStartSLOduration=2.053963764 podStartE2EDuration="4.495323051s" podCreationTimestamp="2026-04-16 15:23:01 +0000 UTC" firstStartedPulling="2026-04-16 15:23:02.429043353 +0000 UTC m=+691.458614709" lastFinishedPulling="2026-04-16 15:23:04.87040263 +0000 UTC m=+693.899973996" observedRunningTime="2026-04-16 15:23:05.494313515 +0000 UTC m=+694.523884887" watchObservedRunningTime="2026-04-16 15:23:05.495323051 +0000 UTC m=+694.524894429"
Apr 16 15:23:16.481515 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:16.481480 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-56m2g"
Apr 16 15:23:32.804275 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:23:32.804241 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:24:06.277216 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.277182 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"]
Apr 16 15:24:06.279679 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.279657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.283566 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.283537 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 15:24:06.283688 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.283588 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 15:24:06.283688 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.283602 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 16 15:24:06.283688 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.283602 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wnsmp\""
Apr 16 15:24:06.290691 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.290670 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"]
Apr 16 15:24:06.316823 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.316792 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3285923c-92f4-489a-ac6a-0054bda94650-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.316994 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.316841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.316994 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.316912 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.316994 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.316966 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675wb\" (UniqueName: \"kubernetes.io/projected/3285923c-92f4-489a-ac6a-0054bda94650-kube-api-access-675wb\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.317163 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.317040 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.317163 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.317072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418066 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3285923c-92f4-489a-ac6a-0054bda94650-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418241 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418241 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418241 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418125 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-675wb\" (UniqueName: \"kubernetes.io/projected/3285923c-92f4-489a-ac6a-0054bda94650-kube-api-access-675wb\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418241 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418241 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418638 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418743 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.418806 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.418775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.420383 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.420361 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3285923c-92f4-489a-ac6a-0054bda94650-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.420659 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.420643 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3285923c-92f4-489a-ac6a-0054bda94650-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.426758 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.426738 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-675wb\" (UniqueName: \"kubernetes.io/projected/3285923c-92f4-489a-ac6a-0054bda94650-kube-api-access-675wb\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns\" (UID: \"3285923c-92f4-489a-ac6a-0054bda94650\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.590177 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.590146 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:06.712217 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.712193 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"]
Apr 16 15:24:06.714853 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:24:06.714827 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3285923c_92f4_489a_ac6a_0054bda94650.slice/crio-8e95d432d442820caa694f13b7fc00a938fd164c88cbb2db581d3c68f614c271 WatchSource:0}: Error finding container 8e95d432d442820caa694f13b7fc00a938fd164c88cbb2db581d3c68f614c271: Status 404 returned error can't find the container with id 8e95d432d442820caa694f13b7fc00a938fd164c88cbb2db581d3c68f614c271
Apr 16 15:24:06.722450 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:06.722426 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:24:07.672304 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:07.672275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns" event={"ID":"3285923c-92f4-489a-ac6a-0054bda94650","Type":"ContainerStarted","Data":"8e95d432d442820caa694f13b7fc00a938fd164c88cbb2db581d3c68f614c271"}
Apr 16 15:24:09.210768 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:09.210658 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:24:13.704761 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:13.704723 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns" event={"ID":"3285923c-92f4-489a-ac6a-0054bda94650","Type":"ContainerStarted","Data":"3fd2a96d9f859aeab58c536273a9ce139f13bdfb27db272a8e0de35bb7734264"}
Apr 16 15:24:17.605115 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:17.605081 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:24:19.727617 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:19.727580 2567 generic.go:358] "Generic (PLEG): container finished" podID="3285923c-92f4-489a-ac6a-0054bda94650" containerID="3fd2a96d9f859aeab58c536273a9ce139f13bdfb27db272a8e0de35bb7734264" exitCode=0
Apr 16 15:24:19.728000 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:19.727656 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns" event={"ID":"3285923c-92f4-489a-ac6a-0054bda94650","Type":"ContainerDied","Data":"3fd2a96d9f859aeab58c536273a9ce139f13bdfb27db272a8e0de35bb7734264"}
Apr 16 15:24:22.491251 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:22.491218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:24:23.746687 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:23.746609 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns" event={"ID":"3285923c-92f4-489a-ac6a-0054bda94650","Type":"ContainerStarted","Data":"3ccef862d5c961feb20e3d12b829c2075f9fc4f891a79ce6642777194dd369d7"}
Apr 16 15:24:23.747087 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:23.746840 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:23.766145 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:23.766103 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns" podStartSLOduration=1.010267265 podStartE2EDuration="17.766088874s" podCreationTimestamp="2026-04-16 15:24:06 +0000 UTC" firstStartedPulling="2026-04-16 15:24:06.716956176 +0000 UTC m=+755.746527533" lastFinishedPulling="2026-04-16 15:24:23.472777785 +0000 UTC m=+772.502349142" observedRunningTime="2026-04-16 15:24:23.764595028 +0000 UTC m=+772.794166406" watchObservedRunningTime="2026-04-16 15:24:23.766088874 +0000 UTC m=+772.795660253"
Apr 16 15:24:30.611586 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:30.611550 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"]
Apr 16 15:24:34.762475 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:34.762440 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns"
Apr 16 15:24:44.883095 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:44.883057 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"]
Apr 16 15:24:44.904682 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:44.904652 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"]
Apr 16 15:24:44.904840 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:44.904793 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:44.907871 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:44.907846 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 16 15:24:45.052039 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.051998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2db1ce9c-0141-4f42-a7f3-947057f04385-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.052221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.052048 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgzj\" (UniqueName: \"kubernetes.io/projected/2db1ce9c-0141-4f42-a7f3-947057f04385-kube-api-access-kvgzj\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.052221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.052078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.052221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.052157 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.052221 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.052189 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.052396 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.052236 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153066 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.152981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153066 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153273 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153273 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153273 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2db1ce9c-0141-4f42-a7f3-947057f04385-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153273 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvgzj\" (UniqueName: \"kubernetes.io/projected/2db1ce9c-0141-4f42-a7f3-947057f04385-kube-api-access-kvgzj\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153472 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153362 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153472 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153422 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.153616 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.153594 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.155347 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.155326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2db1ce9c-0141-4f42-a7f3-947057f04385-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.155574 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.155556 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2db1ce9c-0141-4f42-a7f3-947057f04385-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.165200 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.165177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgzj\" (UniqueName: \"kubernetes.io/projected/2db1ce9c-0141-4f42-a7f3-947057f04385-kube-api-access-kvgzj\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq\" (UID: \"2db1ce9c-0141-4f42-a7f3-947057f04385\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"
Apr 16 15:24:45.214599 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.214572 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" Apr 16 15:24:45.347131 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.347096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq"] Apr 16 15:24:45.350328 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:24:45.350298 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db1ce9c_0141_4f42_a7f3_947057f04385.slice/crio-9e4a04bd4bc1d479bd265081831597ccee1116d8d836a09c69a9ee4afb6a706d WatchSource:0}: Error finding container 9e4a04bd4bc1d479bd265081831597ccee1116d8d836a09c69a9ee4afb6a706d: Status 404 returned error can't find the container with id 9e4a04bd4bc1d479bd265081831597ccee1116d8d836a09c69a9ee4afb6a706d Apr 16 15:24:45.352020 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.352005 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:24:45.770990 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.770893 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6"] Apr 16 15:24:45.774158 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.774140 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.777115 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.777094 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 15:24:45.784922 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.784898 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6"] Apr 16 15:24:45.815187 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.815159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" event={"ID":"2db1ce9c-0141-4f42-a7f3-947057f04385","Type":"ContainerStarted","Data":"1b8b5f0364213f685a25ff9a55d4b8f53c90b92433ec7faf64df132b753e3cdc"} Apr 16 15:24:45.815328 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.815189 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" event={"ID":"2db1ce9c-0141-4f42-a7f3-947057f04385","Type":"ContainerStarted","Data":"9e4a04bd4bc1d479bd265081831597ccee1116d8d836a09c69a9ee4afb6a706d"} Apr 16 15:24:45.859381 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.859340 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-955jx\" (UniqueName: \"kubernetes.io/projected/6b6616a1-20fd-4969-b123-c15ee9ad8add-kube-api-access-955jx\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.859622 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.859427 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-model-cache\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.859622 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.859544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6616a1-20fd-4969-b123-c15ee9ad8add-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.859762 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.859662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.859762 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.859710 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.859762 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.859757 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.896830 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:24:45.896794 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:24:45.960609 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.960571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6616a1-20fd-4969-b123-c15ee9ad8add-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.960758 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.960657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.960758 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.960706 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.960758 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.960735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.960921 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.960815 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-955jx\" (UniqueName: \"kubernetes.io/projected/6b6616a1-20fd-4969-b123-c15ee9ad8add-kube-api-access-955jx\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.960921 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.960850 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.961186 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.961146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.961288 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.961187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.961422 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.961403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " 
pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.962975 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.962954 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b6616a1-20fd-4969-b123-c15ee9ad8add-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.963628 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.963610 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6616a1-20fd-4969-b123-c15ee9ad8add-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:45.970601 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:45.970574 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-955jx\" (UniqueName: \"kubernetes.io/projected/6b6616a1-20fd-4969-b123-c15ee9ad8add-kube-api-access-955jx\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jf9j6\" (UID: \"6b6616a1-20fd-4969-b123-c15ee9ad8add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:46.087570 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:46.087534 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:46.220303 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:46.220131 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6"] Apr 16 15:24:46.820473 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:46.820399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" event={"ID":"6b6616a1-20fd-4969-b123-c15ee9ad8add","Type":"ContainerStarted","Data":"249ad72f00935286b3e958ba8c605e9406b5a9b6cf8044e58929ef3d366bd760"} Apr 16 15:24:46.820473 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:46.820441 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" event={"ID":"6b6616a1-20fd-4969-b123-c15ee9ad8add","Type":"ContainerStarted","Data":"515910da4e9b0ba5cf9790c59f1dd5a77019e63c27e86822b0c135f39ce77776"} Apr 16 15:24:49.489958 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:49.489915 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:24:51.842830 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:51.842794 2567 generic.go:358] "Generic (PLEG): container finished" podID="2db1ce9c-0141-4f42-a7f3-947057f04385" containerID="1b8b5f0364213f685a25ff9a55d4b8f53c90b92433ec7faf64df132b753e3cdc" exitCode=0 Apr 16 15:24:51.843258 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:51.842864 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" event={"ID":"2db1ce9c-0141-4f42-a7f3-947057f04385","Type":"ContainerDied","Data":"1b8b5f0364213f685a25ff9a55d4b8f53c90b92433ec7faf64df132b753e3cdc"} Apr 16 15:24:51.844327 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:51.844307 2567 generic.go:358] "Generic (PLEG): container finished" podID="6b6616a1-20fd-4969-b123-c15ee9ad8add" 
containerID="249ad72f00935286b3e958ba8c605e9406b5a9b6cf8044e58929ef3d366bd760" exitCode=0 Apr 16 15:24:51.844407 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:51.844346 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" event={"ID":"6b6616a1-20fd-4969-b123-c15ee9ad8add","Type":"ContainerDied","Data":"249ad72f00935286b3e958ba8c605e9406b5a9b6cf8044e58929ef3d366bd760"} Apr 16 15:24:52.849446 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:52.849409 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" event={"ID":"2db1ce9c-0141-4f42-a7f3-947057f04385","Type":"ContainerStarted","Data":"3b99c8674eed6de61ad37b0aa5471ee581d2e860a109baa1692e127d5e772be9"} Apr 16 15:24:52.849919 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:52.849627 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" Apr 16 15:24:52.851062 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:52.851038 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" event={"ID":"6b6616a1-20fd-4969-b123-c15ee9ad8add","Type":"ContainerStarted","Data":"aadfe02f4622a3986b32efa6eb31f5811c6b7bb5fe69b7299725702130799216"} Apr 16 15:24:52.851245 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:52.851229 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:24:52.871032 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:52.870990 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" podStartSLOduration=8.597013528 podStartE2EDuration="8.87097878s" podCreationTimestamp="2026-04-16 15:24:44 +0000 UTC" firstStartedPulling="2026-04-16 15:24:51.843545844 +0000 
UTC m=+800.873117200" lastFinishedPulling="2026-04-16 15:24:52.117511085 +0000 UTC m=+801.147082452" observedRunningTime="2026-04-16 15:24:52.868883657 +0000 UTC m=+801.898455034" watchObservedRunningTime="2026-04-16 15:24:52.87097878 +0000 UTC m=+801.900550157" Apr 16 15:24:52.889212 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:24:52.889172 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" podStartSLOduration=7.612597824 podStartE2EDuration="7.889160579s" podCreationTimestamp="2026-04-16 15:24:45 +0000 UTC" firstStartedPulling="2026-04-16 15:24:51.844908185 +0000 UTC m=+800.874479542" lastFinishedPulling="2026-04-16 15:24:52.121470933 +0000 UTC m=+801.151042297" observedRunningTime="2026-04-16 15:24:52.887258596 +0000 UTC m=+801.916829974" watchObservedRunningTime="2026-04-16 15:24:52.889160579 +0000 UTC m=+801.918731957" Apr 16 15:25:03.866987 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:25:03.866959 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq" Apr 16 15:25:03.867683 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:25:03.867665 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jf9j6" Apr 16 15:26:16.203087 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:16.203056 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:26:26.695848 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:26.695815 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:26:31.464066 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:31.464038 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:26:31.464462 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:31.464447 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:26:34.793318 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:34.793284 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:26:45.422157 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:45.422121 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:26:54.491128 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:26:54.491094 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:27:05.589755 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:27:05.589724 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:28:06.399527 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:28:06.399453 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:28:21.792130 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:28:21.792096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:29:01.006415 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:29:01.006379 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:29:16.989454 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:29:16.989416 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:29:31.788737 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:29:31.788663 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:29:47.690692 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:29:47.690663 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:30:39.686844 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:30:39.686807 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:30:49.694995 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:30:49.694956 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:31:05.991583 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:31:05.991503 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:31:13.887889 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:31:13.887852 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:31:31.489033 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:31:31.489002 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:31:31.489785 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:31:31.489765 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:31:31.491065 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:31:31.491045 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:31:39.394854 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:31:39.394816 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:32:11.888572 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:32:11.888532 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:32:20.892624 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:32:20.892545 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:32:29.587507 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:32:29.587472 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:32:36.891859 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:32:36.891816 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:32:46.088497 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:32:46.088460 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:33:03.188515 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:33:03.188476 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:33:16.393001 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:33:16.392962 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:02.995335 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:02.995250 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:10.894562 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:10.894527 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:20.387644 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:20.387612 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:28.595116 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:28.595080 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:37.887122 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:37.887084 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:46.092188 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:46.092153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:34:55.091247 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:34:55.091205 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:03.285273 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:03.285237 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:12.489901 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:12.489864 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:20.995602 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:20.995525 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:30.192706 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:30.192674 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:38.588025 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:38.587993 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:47.595211 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:47.595173 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:35:55.695438 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:35:55.695399 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:36:04.791091 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:36:04.791045 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:36:13.194201 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:36:13.194162 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:36:22.590138 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:36:22.590100 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:36:30.797288 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:36:30.797254 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:36:31.510565 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:36:31.510537 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:36:31.512429 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:36:31.512409 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:38:47.396941 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:38:47.396904 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:38:53.493735 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:38:53.493701 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:39:18.491998 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:39:18.491958 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:39:23.586415 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:39:23.586376 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:39:34.088670 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:39:34.088633 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:39:43.505213 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:39:43.505180 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:39:52.798550 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:39:52.798463 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:40:03.621736 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:40:03.621700 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:40:12.493801 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:40:12.493763 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:40:22.597686 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:40:22.597651 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:40:31.698362 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:40:31.698329 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:40:41.986910 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:40:41.986875 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:40:51.290548 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:40:51.290508 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:41:26.489576 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:41:26.489500 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:41:31.532488 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:41:31.532459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:41:31.534290 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:41:31.534269 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:42:08.089808 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:42:08.089773 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:42:17.096586 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:42:17.096547 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:42:26.587354 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:42:26.587313 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:42:35.283201 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:42:35.283163 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:42:43.696665 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:42:43.696626 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:42:55.583168 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:42:55.583090 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:43:04.195881 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:43:04.195844 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:43:12.188047 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:43:12.188010 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:43:21.191404 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:43:21.191367 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:43:29.195637 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:43:29.195600 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:43:38.298919 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:43:38.298883 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:43:50.192583 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:43:50.192547 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:44:08.485627 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:44:08.485592 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:44:16.188296 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:44:16.188261 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:44:25.086967 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:44:25.086874 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:44:32.995687 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:44:32.995646 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:44:50.198758 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:44:50.198721 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:44:58.588722 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:44:58.588686 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:45:07.490830 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:45:07.490793 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:45:15.792160 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:45:15.792121 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:45:25.191639 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:45:25.191604 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:45:33.189182 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:45:33.189143 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:45:41.893752 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:45:41.893718 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:45:53.489484 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:45:53.489409 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:46:03.093605 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:03.093566 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:46:16.090076 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:16.090040 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:46:25.593370 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:25.593337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:46:31.555582 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:31.555551 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:46:31.558799 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:31.558780 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:46:33.398623 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:33.398586 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:46:40.994021 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:40.993983 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:46:50.490582 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:46:50.490541 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:47:06.988423 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:47:06.988385 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:47:14.998107 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:47:14.998074 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:47:23.146083 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:47:23.145993 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:47:31.890715 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:47:31.890680 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:47:54.694015 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:47:54.693978 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:48:07.697970 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:07.697915 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-56m2g"] Apr 16 15:48:13.620661 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:13.620623 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-fth9h_c40e33f2-82ed-4240-911d-e07cd3c6f7ff/manager/0.log" Apr 16 15:48:13.996317 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:13.996232 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jslnj_2d4c80b4-e69c-47d7-b164-3260950a2215/manager/2.log" Apr 16 15:48:14.374160 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:14.374127 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-68df4b58f7-ljsh2_bff12472-42d3-46f3-8d9e-a83c2d3ed06c/manager/0.log" Apr 16 15:48:15.880630 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:15.880572 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-2j82l_80d2cb48-658d-409a-bf79-90e5fc42ee50/manager/0.log" Apr 16 15:48:16.448697 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:16.448664 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-56m2g_b3287a5b-39e8-4940-b76e-825d9517dcd1/limitador/0.log" Apr 16 15:48:17.025073 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:17.025034 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4rtzg_62e19e13-4da7-4de9-b6d5-caf54792d3fc/discovery/0.log" Apr 16 15:48:17.132195 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:17.132160 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bb547c98c-jlk6k_e663dcf5-78f2-490b-94b7-cedf08e5a957/kube-auth-proxy/0.log" Apr 16 15:48:17.928500 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:17.928469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-jf9j6_6b6616a1-20fd-4969-b123-c15ee9ad8add/storage-initializer/0.log" Apr 16 15:48:17.936265 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:17.936241 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-jf9j6_6b6616a1-20fd-4969-b123-c15ee9ad8add/main/0.log" Apr 16 15:48:18.159136 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:18.159101 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq_2db1ce9c-0141-4f42-a7f3-947057f04385/main/0.log" Apr 16 15:48:18.165781 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:18.165759 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc7rxqq_2db1ce9c-0141-4f42-a7f3-947057f04385/storage-initializer/0.log" Apr 16 15:48:18.277632 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:18.277549 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns_3285923c-92f4-489a-ac6a-0054bda94650/storage-initializer/0.log" Apr 16 15:48:18.285005 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:18.284977 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-5zzns_3285923c-92f4-489a-ac6a-0054bda94650/main/0.log" Apr 16 15:48:24.864179 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:48:24.864148 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pg2mn_52595484-a093-4a5b-8052-226e00ba9507/global-pull-secret-syncer/0.log" Apr 16 15:48:24.944654 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:24.944622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kn9zn_27ceed8c-3179-48b7-9f1d-9d9a245ded1e/konnectivity-agent/0.log" Apr 16 15:48:25.017605 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:25.017554 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-252.ec2.internal_9504c7dace0e1dd78455cb89197ac884/haproxy/0.log" Apr 16 15:48:29.158217 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:29.158183 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-2j82l_80d2cb48-658d-409a-bf79-90e5fc42ee50/manager/0.log" Apr 16 15:48:29.353140 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:29.353114 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-56m2g_b3287a5b-39e8-4940-b76e-825d9517dcd1/limitador/0.log" Apr 16 15:48:31.327550 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:31.327504 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-frdt2_06d73659-193b-47fd-b466-a7decbc45ca9/node-exporter/0.log" Apr 16 15:48:31.348809 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:31.348781 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-frdt2_06d73659-193b-47fd-b466-a7decbc45ca9/kube-rbac-proxy/0.log" Apr 16 15:48:31.368215 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:31.368191 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-frdt2_06d73659-193b-47fd-b466-a7decbc45ca9/init-textfile/0.log" Apr 16 15:48:33.142774 
ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.142741 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-tzrgf_a7946e2b-4899-4e79-8237-f8184b28abd7/networking-console-plugin/0.log" Apr 16 15:48:33.410128 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.410053 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc"] Apr 16 15:48:33.413515 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.413494 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.416938 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.416905 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t57q5\"/\"default-dockercfg-88qxm\"" Apr 16 15:48:33.418146 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.418130 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t57q5\"/\"kube-root-ca.crt\"" Apr 16 15:48:33.418274 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.418254 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t57q5\"/\"openshift-service-ca.crt\"" Apr 16 15:48:33.423157 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.423135 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc"] Apr 16 15:48:33.504223 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.504188 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-podres\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 
16 15:48:33.504404 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.504235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-proc\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.504404 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.504312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-lib-modules\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.504404 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.504357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-sys\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.504555 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.504405 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7k8\" (UniqueName: \"kubernetes.io/projected/a2b56627-c3e7-4874-81cb-0620bc80192e-kube-api-access-wk7k8\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.605834 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.605801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-podres\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.605843 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-proc\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.605888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-lib-modules\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.605910 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-sys\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.605963 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7k8\" (UniqueName: \"kubernetes.io/projected/a2b56627-c3e7-4874-81cb-0620bc80192e-kube-api-access-wk7k8\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 
15:48:33.605995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-proc\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.605995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-podres\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606035 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.606034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-sys\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.606299 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.606042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2b56627-c3e7-4874-81cb-0620bc80192e-lib-modules\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.613048 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.613027 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7k8\" (UniqueName: \"kubernetes.io/projected/a2b56627-c3e7-4874-81cb-0620bc80192e-kube-api-access-wk7k8\") pod \"perf-node-gather-daemonset-lqqqc\" (UID: \"a2b56627-c3e7-4874-81cb-0620bc80192e\") " 
pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.724135 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.724049 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:33.842677 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.842652 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc"] Apr 16 15:48:33.845194 ip-10-0-135-252 kubenswrapper[2567]: W0416 15:48:33.845165 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda2b56627_c3e7_4874_81cb_0620bc80192e.slice/crio-a785b1349add6f42c5fcce1b168ae6fecde2581549d5eb7bbaa29ac0aee531f7 WatchSource:0}: Error finding container a785b1349add6f42c5fcce1b168ae6fecde2581549d5eb7bbaa29ac0aee531f7: Status 404 returned error can't find the container with id a785b1349add6f42c5fcce1b168ae6fecde2581549d5eb7bbaa29ac0aee531f7 Apr 16 15:48:33.846830 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:33.846813 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:48:34.510191 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:34.510157 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" event={"ID":"a2b56627-c3e7-4874-81cb-0620bc80192e","Type":"ContainerStarted","Data":"9a5eec1fab22bef5b84512fcdf98e41f8ee9c070c5609ad893a139cc324a7740"} Apr 16 15:48:34.510191 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:34.510195 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" event={"ID":"a2b56627-c3e7-4874-81cb-0620bc80192e","Type":"ContainerStarted","Data":"a785b1349add6f42c5fcce1b168ae6fecde2581549d5eb7bbaa29ac0aee531f7"} Apr 16 15:48:34.510722 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:34.510359 
2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:34.525703 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:34.525644 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" podStartSLOduration=1.525626478 podStartE2EDuration="1.525626478s" podCreationTimestamp="2026-04-16 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:48:34.524283157 +0000 UTC m=+2223.553854535" watchObservedRunningTime="2026-04-16 15:48:34.525626478 +0000 UTC m=+2223.555197860" Apr 16 15:48:35.582997 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:35.582967 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9vkww_aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9/dns/0.log" Apr 16 15:48:35.602722 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:35.602702 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9vkww_aa9b0fcd-3a0a-43a8-8c23-a08827fb11b9/kube-rbac-proxy/0.log" Apr 16 15:48:35.714033 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:35.714007 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j9vwn_adf454f4-3a18-4824-b7ac-7736800ea721/dns-node-resolver/0.log" Apr 16 15:48:36.256385 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:36.256346 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h97tg_2da4fddf-5318-4d67-9672-73870158cdf2/node-ca/0.log" Apr 16 15:48:37.213547 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:37.213512 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4rtzg_62e19e13-4da7-4de9-b6d5-caf54792d3fc/discovery/0.log" Apr 16 15:48:37.232606 ip-10-0-135-252 
kubenswrapper[2567]: I0416 15:48:37.232577 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bb547c98c-jlk6k_e663dcf5-78f2-490b-94b7-cedf08e5a957/kube-auth-proxy/0.log" Apr 16 15:48:37.829877 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:37.829837 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6cpbg_21ef67e8-6503-4ba6-b6ed-bc1016b3958d/serve-healthcheck-canary/0.log" Apr 16 15:48:38.319498 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:38.319472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2lhkw_423116e6-4cda-4fd6-8ffa-c21a25175327/kube-rbac-proxy/0.log" Apr 16 15:48:38.339223 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:38.339200 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2lhkw_423116e6-4cda-4fd6-8ffa-c21a25175327/exporter/0.log" Apr 16 15:48:38.364086 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:38.364054 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2lhkw_423116e6-4cda-4fd6-8ffa-c21a25175327/extractor/0.log" Apr 16 15:48:40.305620 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:40.305585 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-fth9h_c40e33f2-82ed-4240-911d-e07cd3c6f7ff/manager/0.log" Apr 16 15:48:40.416797 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:40.416760 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jslnj_2d4c80b4-e69c-47d7-b164-3260950a2215/manager/1.log" Apr 16 15:48:40.436469 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:40.436436 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jslnj_2d4c80b4-e69c-47d7-b164-3260950a2215/manager/2.log" Apr 16 
15:48:40.525265 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:40.525221 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-lqqqc" Apr 16 15:48:40.551320 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:40.551285 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-68df4b58f7-ljsh2_bff12472-42d3-46f3-8d9e-a83c2d3ed06c/manager/0.log" Apr 16 15:48:47.745612 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.745583 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/kube-multus-additional-cni-plugins/0.log" Apr 16 15:48:47.764988 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.764964 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/egress-router-binary-copy/0.log" Apr 16 15:48:47.784055 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.784035 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/cni-plugins/0.log" Apr 16 15:48:47.803662 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.803638 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/bond-cni-plugin/0.log" Apr 16 15:48:47.823101 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.823079 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/routeoverride-cni/0.log" Apr 16 15:48:47.843326 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.843299 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/whereabouts-cni-bincopy/0.log" Apr 16 15:48:47.862641 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:47.862613 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7kth7_c25330d1-d516-4851-8786-0d9a8e235f7d/whereabouts-cni/0.log" Apr 16 15:48:48.045473 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:48.045396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nll6d_da21a633-b9e5-4a37-b9bc-12d29b6b666b/kube-multus/0.log" Apr 16 15:48:48.178536 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:48.178501 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x8njb_ba267359-2c95-4792-991e-a2e9eae5b290/network-metrics-daemon/0.log" Apr 16 15:48:48.196493 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:48.196465 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x8njb_ba267359-2c95-4792-991e-a2e9eae5b290/kube-rbac-proxy/0.log" Apr 16 15:48:49.345671 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.345642 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-controller/0.log" Apr 16 15:48:49.365753 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.365724 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/0.log" Apr 16 15:48:49.383214 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.383164 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovn-acl-logging/1.log" Apr 16 15:48:49.421507 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.421478 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/kube-rbac-proxy-node/0.log" Apr 16 15:48:49.458447 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.458420 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:48:49.480152 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.480120 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/northd/0.log" Apr 16 15:48:49.512819 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.512796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/nbdb/0.log" Apr 16 15:48:49.547610 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.547583 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/sbdb/0.log" Apr 16 15:48:49.710151 ip-10-0-135-252 kubenswrapper[2567]: I0416 15:48:49.710073 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jzvn_d93e980a-c222-444d-a15d-49cf63ac1c76/ovnkube-controller/0.log"