Apr 16 10:05:25.049456 ip-10-0-135-215 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 10:05:25.482009 ip-10-0-135-215 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:05:25.482009 ip-10-0-135-215 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 10:05:25.482009 ip-10-0-135-215 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:05:25.482009 ip-10-0-135-215 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 10:05:25.482009 ip-10-0-135-215 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:05:25.482934 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.482789    2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 10:05:25.485883 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485862    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:05:25.485883 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485879    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:05:25.485883 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485884    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:05:25.485883 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485888    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485900    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485905    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485909    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485913    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485917    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485920    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485924    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485928    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485932    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485936    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485940    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485944    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485948    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485951    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485955    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485959    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485963    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485968    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485971    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:05:25.486141 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485975    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485980    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485984    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485988    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485993    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.485996    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486000    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486005    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486009    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486013    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486019    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486023    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486028    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486033    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486038    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486042    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486046    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486051    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486055    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:05:25.486916 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486059    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486064    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486069    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486073    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486077    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486082    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486086    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486090    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486094    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486098    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486102    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486106    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486111    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486115    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486122    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486128    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486136    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486140    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486144    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486148    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:05:25.487422 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486167    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486172    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486176    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486180    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486184    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486188    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486192    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486199    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486203    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486207    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486211    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486215    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486220    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486224    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486228    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486232    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486236    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486241    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486245    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:05:25.487989 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486252    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486258    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486263    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486268    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.486272    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487711    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487725    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487729    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487732    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487735    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487738    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487742    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487745    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487748    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487751    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487754    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487757    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487762    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487765    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487767    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:05:25.488560 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487770    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487773    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487776    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487779    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487782    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487785    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487788    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487791    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487794    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487797    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487799    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487802    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487805    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487807    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487811    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487814    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487817    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487819    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487824    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487827    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:05:25.489051 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487830    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487833    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487836    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487839    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487841    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487844    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487847    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487849    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487852    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487855    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487857    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487860    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487862    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487865    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487868    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487870    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487873    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487875    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487878    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487881    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:05:25.489572 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487883    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487886    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487888    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487891    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487895    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487899    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487902    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487905    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487908    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487911    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487916    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487921    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487925    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487928    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487930    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487933    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487936    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487938    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487941    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:05:25.490064 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487944    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487946    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487949    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487951    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487954    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487957    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487959    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487962    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487964    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487967    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487969    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.487972    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488047    2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488055    2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488063    2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488068    2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488072    2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488076    2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488080    2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488085    2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488089    2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 10:05:25.490583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488092    2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488096    2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488101    2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488104    2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488108    2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488111    2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488114    2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488117    2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488120    2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488123    2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488127    2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488130    2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488134    2575 flags.go:64] FLAG: --config-dir=""
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488136    2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488140    2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488144    2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488147    2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488165    2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488169    2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488172    2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488175    2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488178    2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488182    2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488185    2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488189    2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 10:05:25.491107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488192    2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488195    2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488198    2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488201    2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488204    2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488210    2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488213    2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488216    2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488220    2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488223    2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488227    2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488230    2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488234    2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488237    2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488240    2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488243    2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488246    2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488249    2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488252    2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488255    2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488257    2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488261    2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488265    2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488268    2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488271    2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488281    2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 10:05:25.491765 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488285    2575 flags.go:64] FLAG: --help="false"
Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488289    2575 flags.go:64] FLAG: --hostname-override="ip-10-0-135-215.ec2.internal"
Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488292    2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 10:05:25.492413 ip-10-0-135-215
kubenswrapper[2575]: I0416 10:05:25.488295 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488298 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488302 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488305 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488308 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488311 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488314 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488317 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488320 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488323 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488326 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488329 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488333 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488336 2575 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488339 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488342 2575 flags.go:64] FLAG: --lock-file="" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488346 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488349 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488352 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488358 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 10:05:25.492413 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488361 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488364 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488368 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488370 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488374 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488377 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488380 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488384 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 10:05:25.492996 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:05:25.488388 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488392 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488395 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488398 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488401 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488405 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488408 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488411 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488414 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488422 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488425 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488428 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488432 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488434 2575 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488440 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488443 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 10:05:25.492996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488446 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488449 2575 flags.go:64] FLAG: --port="10250" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488452 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488455 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0399b8c6621fc131e" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488459 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488463 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488466 2575 flags.go:64] FLAG: --register-node="true" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488469 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488472 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488475 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488478 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488481 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 
10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488485 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488489 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488492 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488495 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488498 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488500 2575 flags.go:64] FLAG: --runonce="false" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488503 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488506 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488510 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488513 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488516 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488519 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488522 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488526 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 10:05:25.493638 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:05:25.488529 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488532 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488535 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488538 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488541 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488544 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488548 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488554 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488556 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488559 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488564 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488567 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488571 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488574 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488577 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488581 2575 flags.go:64] FLAG: --v="2" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488585 2575 flags.go:64] FLAG: --version="false" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488589 2575 flags.go:64] FLAG: --vmodule="" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488594 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.488597 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488690 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488693 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488696 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488699 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 10:05:25.494281 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488702 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488704 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488707 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488709 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 
10:05:25.488712 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488715 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488717 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488725 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488729 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488731 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488734 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488736 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488739 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488742 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488744 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488747 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488750 2575 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488752 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488755 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488757 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 10:05:25.494885 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488760 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488763 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488766 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488770 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488774 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488776 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488779 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488782 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488784 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488787 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488789 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488792 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488795 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488798 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488800 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488803 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488806 2575 feature_gate.go:328] unrecognized 
feature gate: ExternalSnapshotMetadata Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488810 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488813 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 10:05:25.495509 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488818 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488821 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488824 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488826 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488829 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488832 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488835 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488838 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488840 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488843 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 10:05:25.496283 
ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488846 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488848 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488851 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488853 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488856 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488859 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488862 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488864 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488867 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488870 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 10:05:25.496283 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488872 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488875 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488877 2575 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488880 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488882 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488885 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488887 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488890 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488892 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488895 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488897 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488900 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488906 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488908 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488911 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 10:05:25.496780 ip-10-0-135-215 
kubenswrapper[2575]: W0416 10:05:25.488913 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488916 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488918 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488921 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488923 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:05:25.496780 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488926 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488928 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.488931 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.489649 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.497214 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.497233 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497284 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497289 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497293 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497296 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497299 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497302 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497306 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497309 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497312 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497315 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:05:25.497409 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497318 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497321 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497324 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497327 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497329 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497333 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497335 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497339 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497342 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497344 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497347 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497350 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497352 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497355 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497357 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497360 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497363 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497366 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497370 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497374 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:05:25.497817 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497378 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497381 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497384 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497387 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497390 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497393 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497395 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497398 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497401 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497403 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497406 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497409 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497411 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497414 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497417 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497419 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497424 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497427 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497430 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497434 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:05:25.498348 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497436 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497440 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497443 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497445 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497448 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497451 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497454 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497457 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497460 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497463 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497465 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497468 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497470 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497474 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497477 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497480 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497483 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497485 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497488 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497490 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:05:25.498844 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497493 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497496 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497498 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497500 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497503 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497506 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497508 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497511 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497514 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497516 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497519 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497521 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497524 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497526 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497529 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:05:25.499356 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497531 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.497537 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497638 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497642 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497645 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497648 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497651 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497654 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497656 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497659 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497662 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497665 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497668 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497670 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497673 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497676 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:05:25.499720 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497679 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497681 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497684 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497687 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497689 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497692 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497695 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497698 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497701 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497704 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497706 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497709 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497711 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497714 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497716 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497718 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497721 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497724 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497726 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:05:25.500117 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497729 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497731 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497734 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497736 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497739 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497741 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497744 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497747 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497751 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497754 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497756 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497759 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497762 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497764 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497767 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497769 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497772 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497775 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497777 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497780 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:05:25.500614 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497782 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497784 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497789 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497792 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497794 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497797 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497799 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497802 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497804 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497807 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497809 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497812 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497815 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497817 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497820 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497823 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497826 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497828 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497831 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:05:25.501106 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497835 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497839 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497842 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497845 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497848 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497850 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497853 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497856 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497859 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497861 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497865 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497868 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497871 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:25.497874 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.497879 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:05:25.501590 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.498533 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 10:05:25.502637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.502622 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 10:05:25.503545 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.503534 2575 server.go:1019] "Starting client certificate rotation"
Apr 16 10:05:25.503645 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.503627 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 10:05:25.503680 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.503672 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 10:05:25.526650 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.526626 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 10:05:25.529107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.529089 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 10:05:25.542382 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.542356 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 10:05:25.549702 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.549680 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 10:05:25.551128 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.551105 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 10:05:25.554721 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.554698 2575 fs.go:135] Filesystem UUIDs: map[60a94afd-3f6a-41c8-b057-7df6588756cd:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c33b0199-c464-4838-a29e-70bebd2c5cda:/dev/nvme0n1p4]
Apr 16 10:05:25.554809 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.554719 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 10:05:25.556960 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.556943 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 10:05:25.560800 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.560691 2575 manager.go:217] Machine: {Timestamp:2026-04-16 10:05:25.558474154 +0000 UTC m=+0.395077335 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098597 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a3a937867843c5d2aea6c93c454e9 SystemUUID:ec2a3a93-7867-843c-5d2a-ea6c93c454e9 BootID:7632c538-85b9-466b-a97e-96bac1913a5f Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9b:fc:0f:5a:5b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9b:fc:0f:5a:5b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:2f:f7:34:3d:d5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 10:05:25.560800 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.560795 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 10:05:25.560913 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.560883 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 10:05:25.561942 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.561917 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 10:05:25.562087 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.561944 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-215.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 10:05:25.562134 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.562096 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 10:05:25.562134 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.562105 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 10:05:25.562134 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.562118 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 10:05:25.562865 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.562854 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 10:05:25.563711 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.563700 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 10:05:25.563822 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.563813 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 10:05:25.567288 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.567276 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 10:05:25.568012 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.568002 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 10:05:25.568058 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.568021 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 10:05:25.568058 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.568031 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 10:05:25.568058 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.568040 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 10:05:25.569352 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.569337 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 10:05:25.569408 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.569365 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 10:05:25.572646 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.572629 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 10:05:25.574897 ip-10-0-135-215
kubenswrapper[2575]: I0416 10:05:25.574883 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 10:05:25.576298 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576286 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576303 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576310 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576320 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576327 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576333 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576339 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576344 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 10:05:25.576348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576351 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 10:05:25.576549 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576357 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 10:05:25.576549 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576366 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
10:05:25.576549 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.576375 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 10:05:25.577589 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.577572 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hkz9t" Apr 16 10:05:25.578012 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.578002 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 10:05:25.578042 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.578014 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 10:05:25.578573 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.578551 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 10:05:25.578630 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.578551 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-215.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 10:05:25.581880 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.581867 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 10:05:25.581930 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.581906 2575 server.go:1295] "Started kubelet" Apr 16 10:05:25.582073 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.582011 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 10:05:25.582145 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:05:25.582115 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 10:05:25.582350 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.582031 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 10:05:25.582680 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.582658 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hkz9t" Apr 16 10:05:25.582939 ip-10-0-135-215 systemd[1]: Started Kubernetes Kubelet. Apr 16 10:05:25.583054 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.583040 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 10:05:25.584836 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.584822 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 16 10:05:25.590592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.590574 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 10:05:25.590592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.590586 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 10:05:25.591277 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.591261 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 10:05:25.591277 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.591264 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 10:05:25.591401 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.591284 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 10:05:25.591401 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.591373 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 16 10:05:25.591401 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.591379 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 16 10:05:25.591401 ip-10-0-135-215 
kubenswrapper[2575]: E0416 10:05:25.591389 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found" Apr 16 10:05:25.591867 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.591839 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 10:05:25.592235 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592219 2575 factory.go:153] Registering CRI-O factory Apr 16 10:05:25.592322 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592240 2575 factory.go:223] Registration of the crio container factory successfully Apr 16 10:05:25.592322 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592306 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 10:05:25.592322 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592315 2575 factory.go:55] Registering systemd factory Apr 16 10:05:25.592322 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592323 2575 factory.go:223] Registration of the systemd container factory successfully Apr 16 10:05:25.592493 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592344 2575 factory.go:103] Registering Raw factory Apr 16 10:05:25.592493 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592354 2575 manager.go:1196] Started watching for new ooms in manager Apr 16 10:05:25.592767 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592748 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:05:25.592928 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.592913 2575 manager.go:319] Starting recovery of all containers Apr 16 10:05:25.594949 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:05:25.594925 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-215.ec2.internal" not found Apr 16 10:05:25.595997 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.595973 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-215.ec2.internal\" not found" node="ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.601140 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.601030 2575 manager.go:324] Recovery completed Apr 16 10:05:25.605261 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.605246 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:05:25.609135 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.609120 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:05:25.609228 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.609163 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:05:25.609228 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.609178 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:05:25.609670 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.609653 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 10:05:25.609670 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.609668 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 10:05:25.609770 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.609684 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 10:05:25.610094 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.610081 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-215.ec2.internal" not found Apr 16 
10:05:25.612043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.612032 2575 policy_none.go:49] "None policy: Start" Apr 16 10:05:25.612095 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.612047 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 10:05:25.612095 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.612056 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 16 10:05:25.644738 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.644723 2575 manager.go:341] "Starting Device Plugin manager" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.644753 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.644763 2575 server.go:85] "Starting device plugin registration server" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.645063 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.645074 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.645236 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.645315 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.645332 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.646234 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 10:05:25.658862 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.646313 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-215.ec2.internal\" not found" Apr 16 10:05:25.665825 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.665803 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-215.ec2.internal" not found Apr 16 10:05:25.726812 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.726772 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 10:05:25.728016 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.727997 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 10:05:25.728096 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.728033 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 10:05:25.728096 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.728057 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 10:05:25.728096 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.728066 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 10:05:25.728247 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.728110 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 10:05:25.731226 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.731201 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:05:25.746239 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.746191 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:05:25.747080 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.747065 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:05:25.747148 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.747093 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:05:25.747148 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.747104 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:05:25.747148 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.747128 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.756643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.756626 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.756688 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.756647 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-215.ec2.internal\": node \"ip-10-0-135-215.ec2.internal\" not found" Apr 16 
10:05:25.777116 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.777091 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found" Apr 16 10:05:25.828481 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.828447 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal"] Apr 16 10:05:25.828586 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.828560 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:05:25.830324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.830304 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:05:25.830440 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.830336 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:05:25.830440 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.830350 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:05:25.831809 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.831792 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:05:25.831983 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.831967 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.832028 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.831998 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:05:25.836683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.836663 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:05:25.836787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.836684 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:05:25.836787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.836693 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:05:25.836787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.836703 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:05:25.836787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.836704 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:05:25.836787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.836717 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:05:25.838524 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.838506 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.838567 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.838543 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:05:25.839324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.839308 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:05:25.839424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.839335 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:05:25.839424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.839347 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:05:25.863036 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.863008 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-215.ec2.internal\" not found" node="ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.867408 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.867388 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-215.ec2.internal\" not found" node="ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.877436 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.877421 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found" Apr 16 10:05:25.892489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.892467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63ef4af2b99ce412233ebbc0edd9cbf1-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal\" (UID: \"63ef4af2b99ce412233ebbc0edd9cbf1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.892558 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.892497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63ef4af2b99ce412233ebbc0edd9cbf1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal\" (UID: \"63ef4af2b99ce412233ebbc0edd9cbf1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.892558 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.892518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/42c5bf6a0ae41a4c6f004afe4db8cd52-config\") pod \"kube-apiserver-proxy-ip-10-0-135-215.ec2.internal\" (UID: \"42c5bf6a0ae41a4c6f004afe4db8cd52\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.977798 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:25.977766 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found" Apr 16 10:05:25.993071 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.993049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63ef4af2b99ce412233ebbc0edd9cbf1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal\" (UID: \"63ef4af2b99ce412233ebbc0edd9cbf1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.993130 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.993079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/63ef4af2b99ce412233ebbc0edd9cbf1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal\" (UID: \"63ef4af2b99ce412233ebbc0edd9cbf1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.993130 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.993101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/42c5bf6a0ae41a4c6f004afe4db8cd52-config\") pod \"kube-apiserver-proxy-ip-10-0-135-215.ec2.internal\" (UID: \"42c5bf6a0ae41a4c6f004afe4db8cd52\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.993215 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.993138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63ef4af2b99ce412233ebbc0edd9cbf1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal\" (UID: \"63ef4af2b99ce412233ebbc0edd9cbf1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.993215 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.993169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63ef4af2b99ce412233ebbc0edd9cbf1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal\" (UID: \"63ef4af2b99ce412233ebbc0edd9cbf1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" Apr 16 10:05:25.993215 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:25.993202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/42c5bf6a0ae41a4c6f004afe4db8cd52-config\") pod \"kube-apiserver-proxy-ip-10-0-135-215.ec2.internal\" (UID: \"42c5bf6a0ae41a4c6f004afe4db8cd52\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal"
Apr 16 10:05:26.078521 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.078446 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.166973 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.166945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal"
Apr 16 10:05:26.170531 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.170511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal"
Apr 16 10:05:26.179103 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.179085 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.279627 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.279585 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.380146 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.380059 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.480767 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.480734 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.499199 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.499175 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:05:26.503987 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.503969 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 10:05:26.504132 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.504115 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 10:05:26.504222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.504125 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 10:05:26.504222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.504131 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 10:05:26.581605 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.581577 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.584735 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.584710 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 10:00:25 +0000 UTC" deadline="2027-11-13 09:50:01.189209021 +0000 UTC"
Apr 16 10:05:26.584735 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.584734 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13823h44m34.604477209s"
Apr 16 10:05:26.590861 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.590838 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 10:05:26.607887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.607858 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 10:05:26.630135 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.630107 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hq6k9"
Apr 16 10:05:26.636441 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.636385 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hq6k9"
Apr 16 10:05:26.681846 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.681822 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.691131 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:26.691093 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c5bf6a0ae41a4c6f004afe4db8cd52.slice/crio-dbb4bed1f2259a9625272a31fd996a260e142f12dea6d6ef32f9d06c252086ae WatchSource:0}: Error finding container dbb4bed1f2259a9625272a31fd996a260e142f12dea6d6ef32f9d06c252086ae: Status 404 returned error can't find the container with id dbb4bed1f2259a9625272a31fd996a260e142f12dea6d6ef32f9d06c252086ae
Apr 16 10:05:26.691725 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:26.691697 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ef4af2b99ce412233ebbc0edd9cbf1.slice/crio-a5d623dd0fa69ac59f3161af496dd2f35e03d71a054840da81e9c6b944b27caa WatchSource:0}: Error finding container a5d623dd0fa69ac59f3161af496dd2f35e03d71a054840da81e9c6b944b27caa: Status 404 returned error can't find the container with id a5d623dd0fa69ac59f3161af496dd2f35e03d71a054840da81e9c6b944b27caa
Apr 16 10:05:26.695675 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.695659 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 10:05:26.731538 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.731480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal" event={"ID":"42c5bf6a0ae41a4c6f004afe4db8cd52","Type":"ContainerStarted","Data":"dbb4bed1f2259a9625272a31fd996a260e142f12dea6d6ef32f9d06c252086ae"}
Apr 16 10:05:26.732416 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.732389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" event={"ID":"63ef4af2b99ce412233ebbc0edd9cbf1","Type":"ContainerStarted","Data":"a5d623dd0fa69ac59f3161af496dd2f35e03d71a054840da81e9c6b944b27caa"}
Apr 16 10:05:26.782604 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.782573 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.883092 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:26.883060 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-215.ec2.internal\" not found"
Apr 16 10:05:26.941049 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.940974 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:05:26.990983 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:26.990955 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal"
Apr 16 10:05:27.004396 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.004369 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 10:05:27.005306 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.005290 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal"
Apr 16 10:05:27.011694 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.011677 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 10:05:27.568832 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.568799 2575 apiserver.go:52] "Watching apiserver"
Apr 16 10:05:27.576534 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.576507 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 10:05:27.577734 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.577700 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8r8s4","openshift-network-diagnostics/network-check-target-rvbk5","openshift-cluster-node-tuning-operator/tuned-xzl5j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal","openshift-multus/multus-d5h7c","openshift-network-operator/iptables-alerter-h8xhq","openshift-ovn-kubernetes/ovnkube-node-g7t7h","kube-system/konnectivity-agent-cjccl","kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x","openshift-dns/node-resolver-gjdtj","openshift-image-registry/node-ca-rv2dg","openshift-multus/multus-additional-cni-plugins-kvzsv"]
Apr 16 10:05:27.579399 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.579377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gjdtj"
Apr 16 10:05:27.580507 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.580487 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.581401 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.581377 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.582189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.581864 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sb49x\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.582385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.582548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.582712 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.583458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.583468 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.583780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.583843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fdrpx\""
Apr 16 10:05:27.584457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.584187 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 10:05:27.586043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.585297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.586043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.585541 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dg9zt\""
Apr 16 10:05:27.586043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.585547 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.588585 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.588446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:27.588585 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.588514 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:27.589999 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.589586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.589999 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.589674 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rv2dg"
Apr 16 10:05:27.591089 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.591071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.592363 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.592288 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 10:05:27.592363 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.592297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.592544 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.592515 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 10:05:27.592844 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.592700 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.592844 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.592782 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7nsgk\""
Apr 16 10:05:27.592844 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.592817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qntdr\""
Apr 16 10:05:27.593140 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.593106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 10:05:27.593232 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.593214 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nrs6w\""
Apr 16 10:05:27.593462 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.593431 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.593520 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.593458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 10:05:27.593520 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.593499 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.594336 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.594147 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 10:05:27.594433 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.594373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:27.594825 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.594686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.594903 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.594688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 10:05:27.596497 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.596321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.596589 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.596507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tglkg\""
Apr 16 10:05:27.596649 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.596606 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 10:05:27.596649 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.596631 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 10:05:27.596816 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.596797 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2kc5f\""
Apr 16 10:05:27.597273 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.597227 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.597447 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.597413 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 10:05:27.597557 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.597534 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:27.597621 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.597548 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.597674 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.597617 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:27.598323 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.598267 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 10:05:27.599350 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.599091 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 10:05:27.599350 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.599180 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jn6ct\""
Apr 16 10:05:27.599350 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.599286 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 10:05:27.601681 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-tuned\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.601784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-sys\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.601784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-host\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.601784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-run-netns\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.601784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w497f\" (UniqueName: \"kubernetes.io/projected/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-kube-api-access-w497f\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.602002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-run\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-run-ovn-kubernetes\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a82420b-dfa7-4776-ada5-5f24f6e237d2-hosts-file\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj"
Apr 16 10:05:27.602002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-var-lib-kubelet\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cnibin\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.602002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.601968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-system-cni-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysctl-conf\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-system-cni-dir\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-k8s-cni-cncf-io\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysconfig\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-kubernetes\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-node-log\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovnkube-config\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-hostroot\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.602338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-cnibin\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysctl-d\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-var-lib-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-ovn\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovn-node-metrics-cert\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-socket-dir-parent\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-netns\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-etc-kubernetes\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f990235-90f2-4344-9fbc-4a60dac858ad-host\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-etc-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed2a524a-708b-4bef-af2f-4363358430af-cni-binary-copy\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-modprobe-d\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-kubelet\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-slash\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-env-overrides\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5bf46483-1c9c-44d9-8737-763a361a473f-agent-certs\") pod \"konnectivity-agent-cjccl\" (UID: \"5bf46483-1c9c-44d9-8737-763a361a473f\") " pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:27.602994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a65d6d-c433-41e5-a73d-38b4e9860935-tmp\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmqwd\" (UniqueName: \"kubernetes.io/projected/99a65d6d-c433-41e5-a73d-38b4e9860935-kube-api-access-bmqwd\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.602937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-systemd-units\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-conf-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-cni-bin\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6j6p\" (UniqueName: \"kubernetes.io/projected/6f990235-90f2-4344-9fbc-4a60dac858ad-kube-api-access-j6j6p\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-log-socket\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-multus-certs\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kvzsv\" (UID:
\"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-cni-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f990235-90f2-4344-9fbc-4a60dac858ad-serviceca\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-cni-bin\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-cni-netd\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-os-release\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.603817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed2a524a-708b-4bef-af2f-4363358430af-multus-daemon-config\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jxm\" (UniqueName: \"kubernetes.io/projected/8a82420b-dfa7-4776-ada5-5f24f6e237d2-kube-api-access-j2jxm\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-systemd\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603802 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5bf46483-1c9c-44d9-8737-763a361a473f-konnectivity-ca\") pod \"konnectivity-agent-cjccl\" (UID: \"5bf46483-1c9c-44d9-8737-763a361a473f\") " pod="kube-system/konnectivity-agent-cjccl" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgcx8\" (UniqueName: \"kubernetes.io/projected/e21bf7bf-cce9-4deb-8977-30b0f4341386-kube-api-access-pgcx8\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-os-release\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-cni-multus\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.603990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " 
pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovnkube-script-lib\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl88z\" (UniqueName: \"kubernetes.io/projected/1591f776-8015-497b-a4bf-80b359c62427-kube-api-access-fl88z\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7w6r\" (UniqueName: \"kubernetes.io/projected/ed2a524a-708b-4bef-af2f-4363358430af-kube-api-access-f7w6r\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-systemd\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-kubelet\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a82420b-dfa7-4776-ada5-5f24f6e237d2-tmp-dir\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj" Apr 16 10:05:27.604592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.604168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-lib-modules\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.637236 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.637204 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 10:00:26 +0000 UTC" deadline="2027-12-30 22:40:14.8333354 +0000 UTC" Apr 16 10:05:27.637236 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.637236 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14964h34m47.196102391s" Apr 16 10:05:27.672092 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.672064 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:05:27.692423 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.692398 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 10:05:27.704852 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704814 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-run\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-run-ovn-kubernetes\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a82420b-dfa7-4776-ada5-5f24f6e237d2-hosts-file\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-var-lib-kubelet\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-run\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704947 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8dmq\" (UniqueName: \"kubernetes.io/projected/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-kube-api-access-x8dmq\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-run-ovn-kubernetes\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.704972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-sys-fs\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.705026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cnibin\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-var-lib-kubelet\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 
10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a82420b-dfa7-4776-ada5-5f24f6e237d2-hosts-file\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cnibin\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-system-cni-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysctl-conf\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-system-cni-dir\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 
10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-k8s-cni-cncf-io\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysconfig\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-system-cni-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-kubernetes\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-system-cni-dir\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 
16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-kubernetes\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-k8s-cni-cncf-io\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.705424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysconfig\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 
10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysctl-conf\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-node-log\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovnkube-config\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-hostroot\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 
10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-cnibin\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysctl-d\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-var-lib-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-ovn\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovn-node-metrics-cert\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-socket-dir-parent\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-netns\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-etc-kubernetes\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f990235-90f2-4344-9fbc-4a60dac858ad-host\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-etc-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:05:27.705820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-socket-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67znt\" (UniqueName: \"kubernetes.io/projected/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-kube-api-access-67znt\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed2a524a-708b-4bef-af2f-4363358430af-cni-binary-copy\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.705935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-modprobe-d\") pod \"tuned-xzl5j\" (UID: 
\"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-modprobe-d\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-sysctl-d\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-netns\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovnkube-config\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-etc-kubernetes\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-node-log\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-kubelet\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-hostroot\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-etc-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-ovn\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-kubelet\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-var-lib-openvswitch\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-cnibin\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.706887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-slash\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.707692 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-slash\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-env-overrides\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-socket-dir-parent\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5bf46483-1c9c-44d9-8737-763a361a473f-agent-certs\") pod \"konnectivity-agent-cjccl\" (UID: \"5bf46483-1c9c-44d9-8737-763a361a473f\") " pod="kube-system/konnectivity-agent-cjccl" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f990235-90f2-4344-9fbc-4a60dac858ad-host\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:05:27.706429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-device-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-etc-selinux\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a65d6d-c433-41e5-a73d-38b4e9860935-tmp\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmqwd\" (UniqueName: \"kubernetes.io/projected/99a65d6d-c433-41e5-a73d-38b4e9860935-kube-api-access-bmqwd\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-systemd-units\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-conf-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-cni-bin\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706608 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6j6p\" (UniqueName: \"kubernetes.io/projected/6f990235-90f2-4344-9fbc-4a60dac858ad-kube-api-access-j6j6p\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-log-socket\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-multus-certs\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.707692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed2a524a-708b-4bef-af2f-4363358430af-cni-binary-copy\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-env-overrides\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-cni-bin\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.706869 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-conf-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.708541 ip-10-0-135-215 
kubenswrapper[2575]: E0416 10:05:27.706956 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:28.206926076 +0000 UTC m=+3.043529226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-log-socket\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.706873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-systemd-units\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-run-multus-certs\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-cni-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f990235-90f2-4344-9fbc-4a60dac858ad-serviceca\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-cni-bin\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707362 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-multus-cni-dir\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.708541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-cni-bin\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-cni-netd\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-cni-netd\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707449 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-os-release\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed2a524a-708b-4bef-af2f-4363358430af-multus-daemon-config\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jxm\" (UniqueName: \"kubernetes.io/projected/8a82420b-dfa7-4776-ada5-5f24f6e237d2-kube-api-access-j2jxm\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-systemd\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-os-release\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:05:27.707569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-run-systemd\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f990235-90f2-4344-9fbc-4a60dac858ad-serviceca\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5bf46483-1c9c-44d9-8737-763a361a473f-konnectivity-ca\") pod \"konnectivity-agent-cjccl\" (UID: \"5bf46483-1c9c-44d9-8737-763a361a473f\") " pod="kube-system/konnectivity-agent-cjccl" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707628 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-host-slash\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 
10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgcx8\" (UniqueName: \"kubernetes.io/projected/e21bf7bf-cce9-4deb-8977-30b0f4341386-kube-api-access-pgcx8\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-os-release\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-cni-multus\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.709324 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-iptables-alerter-script\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: 
\"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-os-release\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovnkube-script-lib\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-cni-multus\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl88z\" (UniqueName: 
\"kubernetes.io/projected/1591f776-8015-497b-a4bf-80b359c62427-kube-api-access-fl88z\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7w6r\" (UniqueName: \"kubernetes.io/projected/ed2a524a-708b-4bef-af2f-4363358430af-kube-api-access-f7w6r\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed2a524a-708b-4bef-af2f-4363358430af-multus-daemon-config\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.707976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-systemd\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-registration-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708046 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-kubelet\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a82420b-dfa7-4776-ada5-5f24f6e237d2-tmp-dir\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj"
Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-lib-modules\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5bf46483-1c9c-44d9-8737-763a361a473f-konnectivity-ca\") pod \"konnectivity-agent-cjccl\" (UID: \"5bf46483-1c9c-44d9-8737-763a361a473f\") " pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-tuned\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-sys\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.710093 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-host\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-run-netns\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w497f\" (UniqueName: \"kubernetes.io/projected/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-kube-api-access-w497f\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-systemd\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2a524a-708b-4bef-af2f-4363358430af-host-var-lib-kubelet\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-sys\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a82420b-dfa7-4776-ada5-5f24f6e237d2-tmp-dir\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovnkube-script-lib\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-host\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e21bf7bf-cce9-4deb-8977-30b0f4341386-host-run-netns\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.708589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99a65d6d-c433-41e5-a73d-38b4e9860935-lib-modules\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.710891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a65d6d-c433-41e5-a73d-38b4e9860935-tmp\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.710942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99a65d6d-c433-41e5-a73d-38b4e9860935-etc-tuned\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.711829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.711556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5bf46483-1c9c-44d9-8737-763a361a473f-agent-certs\") pod \"konnectivity-agent-cjccl\" (UID: \"5bf46483-1c9c-44d9-8737-763a361a473f\") " pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:27.712437 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.711990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e21bf7bf-cce9-4deb-8977-30b0f4341386-ovn-node-metrics-cert\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.716127 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.716053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6j6p\" (UniqueName: \"kubernetes.io/projected/6f990235-90f2-4344-9fbc-4a60dac858ad-kube-api-access-j6j6p\") pod \"node-ca-rv2dg\" (UID: \"6f990235-90f2-4344-9fbc-4a60dac858ad\") " pod="openshift-image-registry/node-ca-rv2dg"
Apr 16 10:05:27.717009 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.716982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl88z\" (UniqueName: \"kubernetes.io/projected/1591f776-8015-497b-a4bf-80b359c62427-kube-api-access-fl88z\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:27.717610 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.717567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jxm\" (UniqueName: \"kubernetes.io/projected/8a82420b-dfa7-4776-ada5-5f24f6e237d2-kube-api-access-j2jxm\") pod \"node-resolver-gjdtj\" (UID: \"8a82420b-dfa7-4776-ada5-5f24f6e237d2\") " pod="openshift-dns/node-resolver-gjdtj"
Apr 16 10:05:27.718694 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.718241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7w6r\" (UniqueName: \"kubernetes.io/projected/ed2a524a-708b-4bef-af2f-4363358430af-kube-api-access-f7w6r\") pod \"multus-d5h7c\" (UID: \"ed2a524a-708b-4bef-af2f-4363358430af\") " pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.718694 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.718452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmqwd\" (UniqueName: \"kubernetes.io/projected/99a65d6d-c433-41e5-a73d-38b4e9860935-kube-api-access-bmqwd\") pod \"tuned-xzl5j\" (UID: \"99a65d6d-c433-41e5-a73d-38b4e9860935\") " pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.719279 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.719219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w497f\" (UniqueName: \"kubernetes.io/projected/98bb1e21-3148-43ae-9b64-5d00f0aadc0d-kube-api-access-w497f\") pod \"multus-additional-cni-plugins-kvzsv\" (UID: \"98bb1e21-3148-43ae-9b64-5d00f0aadc0d\") " pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.720360 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.720338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgcx8\" (UniqueName: \"kubernetes.io/projected/e21bf7bf-cce9-4deb-8977-30b0f4341386-kube-api-access-pgcx8\") pod \"ovnkube-node-g7t7h\" (UID: \"e21bf7bf-cce9-4deb-8977-30b0f4341386\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.808993 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.808958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8dmq\" (UniqueName: \"kubernetes.io/projected/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-kube-api-access-x8dmq\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.809193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-sys-fs\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-socket-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67znt\" (UniqueName: \"kubernetes.io/projected/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-kube-api-access-67znt\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-sys-fs\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-device-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-etc-selinux\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-device-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-socket-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-host-slash\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-host-slash\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-iptables-alerter-script\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-registration-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-etc-selinux\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.809519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.809493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-registration-dir\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.810422 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.810403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-iptables-alerter-script\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.815125 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.815100 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 10:05:27.815125 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.815126 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 10:05:27.815305 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.815138 2575 projected.go:194] Error preparing data for projected volume kube-api-access-grv77 for pod openshift-network-diagnostics/network-check-target-rvbk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:27.815305 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:27.815216 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77 podName:b8122b7e-3e94-4772-bea0-462846dfdfab nodeName:}" failed. No retries permitted until 2026-04-16 10:05:28.315197007 +0000 UTC m=+3.151800178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-grv77" (UniqueName: "kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77") pod "network-check-target-rvbk5" (UID: "b8122b7e-3e94-4772-bea0-462846dfdfab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:27.817483 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.817459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8dmq\" (UniqueName: \"kubernetes.io/projected/65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f-kube-api-access-x8dmq\") pod \"iptables-alerter-h8xhq\" (UID: \"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f\") " pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.817589 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.817514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67znt\" (UniqueName: \"kubernetes.io/projected/35bde40b-1b7c-437e-86f6-3c1a102c8bb0-kube-api-access-67znt\") pod \"aws-ebs-csi-driver-node-hn29x\" (UID: \"35bde40b-1b7c-437e-86f6-3c1a102c8bb0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.898404 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.898318 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gjdtj"
Apr 16 10:05:27.908351 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.908324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kvzsv"
Apr 16 10:05:27.915118 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.915096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xzl5j"
Apr 16 10:05:27.922668 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.922648 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5h7c"
Apr 16 10:05:27.929233 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.929213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rv2dg"
Apr 16 10:05:27.937869 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.937852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:27.944437 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.944417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:27.953913 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.953895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x"
Apr 16 10:05:27.960419 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.960400 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h8xhq"
Apr 16 10:05:27.970326 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:27.970309 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:05:28.211785 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.211710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:28.212091 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.212070 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:05:28.212205 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.212193 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:29.21216675 +0000 UTC m=+4.048769916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:05:28.406961 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.406932 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98bb1e21_3148_43ae_9b64_5d00f0aadc0d.slice/crio-9c0f1325ed3f2c876fb1e86966a7bf11744b18fb808465c0b7fa704794957e69 WatchSource:0}: Error finding container 9c0f1325ed3f2c876fb1e86966a7bf11744b18fb808465c0b7fa704794957e69: Status 404 returned error can't find the container with id 9c0f1325ed3f2c876fb1e86966a7bf11744b18fb808465c0b7fa704794957e69
Apr 16 10:05:28.412317 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.412283 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bde40b_1b7c_437e_86f6_3c1a102c8bb0.slice/crio-789997c4941ff5efd1cf0519f8e320865628d7da8d8def041021cbef27e871f8 WatchSource:0}: Error finding container 789997c4941ff5efd1cf0519f8e320865628d7da8d8def041021cbef27e871f8: Status 404 returned error can't find the container with id 789997c4941ff5efd1cf0519f8e320865628d7da8d8def041021cbef27e871f8
Apr 16 10:05:28.412520 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.412491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:28.412664 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.412645 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 10:05:28.412664 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.412659 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 10:05:28.412756 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.412669 2575 projected.go:194] Error preparing data for projected volume kube-api-access-grv77 for pod openshift-network-diagnostics/network-check-target-rvbk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:28.412756 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.412722 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77 podName:b8122b7e-3e94-4772-bea0-462846dfdfab nodeName:}" failed. No retries permitted until 2026-04-16 10:05:29.412700822 +0000 UTC m=+4.249303975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-grv77" (UniqueName: "kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77") pod "network-check-target-rvbk5" (UID: "b8122b7e-3e94-4772-bea0-462846dfdfab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:28.415090 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.415055 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a65d6d_c433_41e5_a73d_38b4e9860935.slice/crio-14daf66349ac8cefab1504ada5ede2c6490eabeab796ff70b040794138a0e745 WatchSource:0}: Error finding container 14daf66349ac8cefab1504ada5ede2c6490eabeab796ff70b040794138a0e745: Status 404 returned error can't find the container with id 14daf66349ac8cefab1504ada5ede2c6490eabeab796ff70b040794138a0e745
Apr 16 10:05:28.415829 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.415795 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f990235_90f2_4344_9fbc_4a60dac858ad.slice/crio-3eb8336153159215b1175d5e991f9060d712854bf6f7c32280858d8202f8d49a WatchSource:0}: Error finding container 3eb8336153159215b1175d5e991f9060d712854bf6f7c32280858d8202f8d49a: Status 404 returned error can't find the container with id 3eb8336153159215b1175d5e991f9060d712854bf6f7c32280858d8202f8d49a
Apr 16 10:05:28.416768 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.416735 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a82420b_dfa7_4776_ada5_5f24f6e237d2.slice/crio-4252f4ac44c2510ee6085a27dacb654ec98a804631e737fc8d655bfe446c762a WatchSource:0}: Error finding container 4252f4ac44c2510ee6085a27dacb654ec98a804631e737fc8d655bfe446c762a: Status 404 returned error can't find the container with id 4252f4ac44c2510ee6085a27dacb654ec98a804631e737fc8d655bfe446c762a
Apr 16 10:05:28.417483 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.417337 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf46483_1c9c_44d9_8737_763a361a473f.slice/crio-7ebc1418471e10665f64f10c74fe46dac79afc565af3d57e0051f9ae16875643 WatchSource:0}: Error finding container 7ebc1418471e10665f64f10c74fe46dac79afc565af3d57e0051f9ae16875643: Status 404 returned error can't find the container with id 7ebc1418471e10665f64f10c74fe46dac79afc565af3d57e0051f9ae16875643
Apr 16 10:05:28.419381 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.419355 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21bf7bf_cce9_4deb_8977_30b0f4341386.slice/crio-e3e2dd9f8819e683b039adf200fcbcff0e7dcb0572492c9f5df76d8d2314a0ba WatchSource:0}: Error finding container e3e2dd9f8819e683b039adf200fcbcff0e7dcb0572492c9f5df76d8d2314a0ba: Status 404 returned error can't find the container with id e3e2dd9f8819e683b039adf200fcbcff0e7dcb0572492c9f5df76d8d2314a0ba
Apr 16 10:05:28.419752 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.419724 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2a524a_708b_4bef_af2f_4363358430af.slice/crio-11eece25fa9ab6da5611881fcc080e8b31a537fdae23030785a269529dad9ac7 WatchSource:0}: Error finding container 11eece25fa9ab6da5611881fcc080e8b31a537fdae23030785a269529dad9ac7: Status 404 returned error can't find the container with id 11eece25fa9ab6da5611881fcc080e8b31a537fdae23030785a269529dad9ac7
Apr 16 10:05:28.420580 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:28.420535 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a7bc47_f0bb_4b71_acd4_41c6d3d7ea1f.slice/crio-013010cc1ad693d66edce1a61276a53afbf0748b32b11ce8d866d1d07837a845 WatchSource:0}: Error finding container 013010cc1ad693d66edce1a61276a53afbf0748b32b11ce8d866d1d07837a845: Status 404 returned error can't find the container with id 013010cc1ad693d66edce1a61276a53afbf0748b32b11ce8d866d1d07837a845
Apr 16 10:05:28.637674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.637636 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 10:00:26 +0000 UTC" deadline="2027-10-27 12:46:29.817498933 +0000 UTC"
Apr 16 10:05:28.637674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.637668 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13418h41m1.179833544s"
Apr 16 10:05:28.728888 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.728783 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:28.729036 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:28.728911 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:28.737992 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.737958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h8xhq" event={"ID":"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f","Type":"ContainerStarted","Data":"013010cc1ad693d66edce1a61276a53afbf0748b32b11ce8d866d1d07837a845"}
Apr 16 10:05:28.739725 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.739698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5h7c" event={"ID":"ed2a524a-708b-4bef-af2f-4363358430af","Type":"ContainerStarted","Data":"11eece25fa9ab6da5611881fcc080e8b31a537fdae23030785a269529dad9ac7"}
Apr 16 10:05:28.742017 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.741974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cjccl" event={"ID":"5bf46483-1c9c-44d9-8737-763a361a473f","Type":"ContainerStarted","Data":"7ebc1418471e10665f64f10c74fe46dac79afc565af3d57e0051f9ae16875643"}
Apr 16 10:05:28.742930 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.742909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rv2dg" event={"ID":"6f990235-90f2-4344-9fbc-4a60dac858ad","Type":"ContainerStarted","Data":"3eb8336153159215b1175d5e991f9060d712854bf6f7c32280858d8202f8d49a"}
Apr 16 10:05:28.743759 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.743742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" event={"ID":"35bde40b-1b7c-437e-86f6-3c1a102c8bb0","Type":"ContainerStarted","Data":"789997c4941ff5efd1cf0519f8e320865628d7da8d8def041021cbef27e871f8"}
Apr 16 10:05:28.744665 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.744644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" event={"ID":"99a65d6d-c433-41e5-a73d-38b4e9860935","Type":"ContainerStarted","Data":"14daf66349ac8cefab1504ada5ede2c6490eabeab796ff70b040794138a0e745"}
Apr 16 10:05:28.745541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.745522 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"e3e2dd9f8819e683b039adf200fcbcff0e7dcb0572492c9f5df76d8d2314a0ba"}
Apr 16 10:05:28.746781 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.746763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gjdtj" event={"ID":"8a82420b-dfa7-4776-ada5-5f24f6e237d2","Type":"ContainerStarted","Data":"4252f4ac44c2510ee6085a27dacb654ec98a804631e737fc8d655bfe446c762a"}
Apr 16 10:05:28.747615 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.747591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerStarted","Data":"9c0f1325ed3f2c876fb1e86966a7bf11744b18fb808465c0b7fa704794957e69"}
Apr 16 10:05:28.748965 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.748947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal" event={"ID":"42c5bf6a0ae41a4c6f004afe4db8cd52","Type":"ContainerStarted","Data":"cd66b7fb85403d09ed2d53fa6c9d49e1ee83b42a5ca51429db7a2ebae33f3eaa"}
Apr 16 10:05:28.762881 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:28.762840 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-215.ec2.internal" podStartSLOduration=1.762828216 podStartE2EDuration="1.762828216s" podCreationTimestamp="2026-04-16 10:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:05:28.762413139 +0000 UTC m=+3.599016309" watchObservedRunningTime="2026-04-16 10:05:28.762828216 +0000 UTC m=+3.599431387"
Apr 16 10:05:29.170089 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:29.169664 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:05:29.221578 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:29.220935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:29.221578 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.221107 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:05:29.221578 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.221189 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:31.221169559 +0000 UTC m=+6.057772723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:05:29.423466 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:29.422686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:29.423466 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.422843 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 10:05:29.423466 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.422864 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 10:05:29.423466 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.422880 2575 projected.go:194] Error preparing data for projected volume kube-api-access-grv77 for pod openshift-network-diagnostics/network-check-target-rvbk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:29.423466 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.422941 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77 podName:b8122b7e-3e94-4772-bea0-462846dfdfab nodeName:}" failed.
No retries permitted until 2026-04-16 10:05:31.422922382 +0000 UTC m=+6.259525537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-grv77" (UniqueName: "kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77") pod "network-check-target-rvbk5" (UID: "b8122b7e-3e94-4772-bea0-462846dfdfab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:29.729752 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:29.729188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:29.729752 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:29.729316 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:29.762689 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:29.762614 2575 generic.go:358] "Generic (PLEG): container finished" podID="63ef4af2b99ce412233ebbc0edd9cbf1" containerID="bc5931d522dbef856137700de4c2a187d2db698a7cb17a8862d77b08b034893b" exitCode=0 Apr 16 10:05:29.763219 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:29.763195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" event={"ID":"63ef4af2b99ce412233ebbc0edd9cbf1","Type":"ContainerDied","Data":"bc5931d522dbef856137700de4c2a187d2db698a7cb17a8862d77b08b034893b"} Apr 16 10:05:30.728942 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:30.728904 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:30.729139 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:30.729051 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:30.770275 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:30.770235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" event={"ID":"63ef4af2b99ce412233ebbc0edd9cbf1","Type":"ContainerStarted","Data":"7be1793f977d6b36c12b32cbea628d57f21ed9fc0ae6c0bad1227bc4e7d32ece"} Apr 16 10:05:31.239183 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:31.239131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:31.239371 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.239349 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:31.239436 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.239420 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:35.239399907 +0000 UTC m=+10.076003060 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:31.441000 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:31.440936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:31.441236 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.441174 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:05:31.441236 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.441197 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:05:31.441236 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.441211 2575 projected.go:194] Error preparing data for projected volume kube-api-access-grv77 for pod openshift-network-diagnostics/network-check-target-rvbk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:31.441411 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.441270 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77 podName:b8122b7e-3e94-4772-bea0-462846dfdfab nodeName:}" failed. 
No retries permitted until 2026-04-16 10:05:35.441251788 +0000 UTC m=+10.277854953 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-grv77" (UniqueName: "kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77") pod "network-check-target-rvbk5" (UID: "b8122b7e-3e94-4772-bea0-462846dfdfab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:31.729246 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:31.728905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:31.729246 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:31.729050 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:32.729006 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:32.728968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:32.729508 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:32.729135 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:33.728405 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:33.728371 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:33.728579 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:33.728501 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:34.728771 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:34.728739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:34.729260 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:34.728874 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:35.275314 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:35.274651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:35.275314 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.274897 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:35.275314 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.274965 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:43.27494522 +0000 UTC m=+18.111548369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:35.476075 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:35.476032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:35.476288 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.476232 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:05:35.476288 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.476253 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:05:35.476288 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.476267 2575 projected.go:194] Error preparing data for projected volume kube-api-access-grv77 for pod openshift-network-diagnostics/network-check-target-rvbk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:35.476433 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.476331 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77 podName:b8122b7e-3e94-4772-bea0-462846dfdfab nodeName:}" failed. 
No retries permitted until 2026-04-16 10:05:43.476311942 +0000 UTC m=+18.312915106 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-grv77" (UniqueName: "kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77") pod "network-check-target-rvbk5" (UID: "b8122b7e-3e94-4772-bea0-462846dfdfab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:35.730492 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:35.730144 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:35.730492 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:35.730273 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:36.729060 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:36.728593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:36.729060 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:36.728739 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:37.729128 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:37.728823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:37.729128 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:37.728949 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:38.728813 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:38.728770 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:38.729057 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:38.728903 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:39.729318 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:39.729283 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:39.729777 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:39.729403 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:40.729173 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:40.729127 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:40.729350 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:40.729259 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:41.729130 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:41.729093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:41.729331 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:41.729239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:42.728509 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:42.728472 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:42.728975 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:42.728614 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:43.332000 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:43.331956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:43.332188 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.332103 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:43.332265 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.332188 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:59.332151423 +0000 UTC m=+34.168754576 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:05:43.533699 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:43.533662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:43.533897 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.533866 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:05:43.533965 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.533898 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:05:43.533965 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.533913 2575 projected.go:194] Error preparing data for projected volume kube-api-access-grv77 for pod openshift-network-diagnostics/network-check-target-rvbk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:43.534032 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.533977 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77 podName:b8122b7e-3e94-4772-bea0-462846dfdfab nodeName:}" failed. 
No retries permitted until 2026-04-16 10:05:59.53395764 +0000 UTC m=+34.370560805 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-grv77" (UniqueName: "kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77") pod "network-check-target-rvbk5" (UID: "b8122b7e-3e94-4772-bea0-462846dfdfab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:05:43.729001 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:43.728961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:43.729358 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:43.729067 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:44.728904 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:44.728867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:05:44.729082 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:44.728998 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427" Apr 16 10:05:45.730095 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.730037 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:45.730484 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:45.730149 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab" Apr 16 10:05:45.804187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.804041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cjccl" event={"ID":"5bf46483-1c9c-44d9-8737-763a361a473f","Type":"ContainerStarted","Data":"e6071e599d80ea756e927270a875de316379bbc2ab6762944c12647001c63309"} Apr 16 10:05:45.807698 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.807657 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rv2dg" event={"ID":"6f990235-90f2-4344-9fbc-4a60dac858ad","Type":"ContainerStarted","Data":"17aa421923bd2f9b6ecf8fe0c44f80f0bcc570e9b40d1887091776279d31e094"} Apr 16 10:05:45.811022 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.810941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" event={"ID":"35bde40b-1b7c-437e-86f6-3c1a102c8bb0","Type":"ContainerStarted","Data":"b2f8d021edb5affa505593838074770d0f8cde53ef933e77d08a58caa9fbee11"} Apr 16 10:05:45.812412 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.812376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" event={"ID":"99a65d6d-c433-41e5-a73d-38b4e9860935","Type":"ContainerStarted","Data":"a543d23e817f9066997f6b72726f09c003d7ad8e64b8cb35a3d93764c43fb603"} Apr 16 10:05:45.817379 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.817347 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-215.ec2.internal" podStartSLOduration=19.817335655 podStartE2EDuration="19.817335655s" podCreationTimestamp="2026-04-16 10:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:05:30.786367807 +0000 UTC m=+5.622970980" watchObservedRunningTime="2026-04-16 10:05:45.817335655 +0000 UTC m=+20.653938826" Apr 16 10:05:45.817783 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.817743 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cjccl" podStartSLOduration=8.384730219 podStartE2EDuration="20.817729163s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.419836891 +0000 UTC m=+3.256440040" lastFinishedPulling="2026-04-16 10:05:40.852835825 +0000 UTC m=+15.689438984" observedRunningTime="2026-04-16 10:05:45.81762041 +0000 UTC m=+20.654223582" watchObservedRunningTime="2026-04-16 10:05:45.817729163 +0000 UTC m=+20.654332335" Apr 16 10:05:45.832642 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.832598 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xzl5j" podStartSLOduration=3.770833767 podStartE2EDuration="20.8325842s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.416863956 +0000 UTC m=+3.253467105" lastFinishedPulling="2026-04-16 10:05:45.478614377 +0000 UTC m=+20.315217538" observedRunningTime="2026-04-16 
10:05:45.832114547 +0000 UTC m=+20.668717721" watchObservedRunningTime="2026-04-16 10:05:45.8325842 +0000 UTC m=+20.669187371"
Apr 16 10:05:45.844854 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:45.844810 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rv2dg" podStartSLOduration=4.090268191 podStartE2EDuration="20.844794635s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.417668002 +0000 UTC m=+3.254271155" lastFinishedPulling="2026-04-16 10:05:45.172194445 +0000 UTC m=+20.008797599" observedRunningTime="2026-04-16 10:05:45.84427709 +0000 UTC m=+20.680880262" watchObservedRunningTime="2026-04-16 10:05:45.844794635 +0000 UTC m=+20.681397806"
Apr 16 10:05:46.199945 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.199698 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:46.728930 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.728898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:46.729088 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:46.729006 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:46.816385 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.816327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5h7c" event={"ID":"ed2a524a-708b-4bef-af2f-4363358430af","Type":"ContainerStarted","Data":"27a95eeb672f517e27d84d2a4a2f5dfad98b2ac6a2b5e56291d9c5295675e61d"}
Apr 16 10:05:46.819342 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.819307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"a403f6fdf2b1de6c4b1ba31fcaed1eb8ed8c26b306494ce73b0f7cc93da58975"}
Apr 16 10:05:46.819466 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.819346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"777f16add94cc54b7e1c246ad734eda8900ae263d88a86538da5dfde435dee3d"}
Apr 16 10:05:46.819466 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.819359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"a933cd71e59ab8b24b065cb590fc7369a713170211f14b216816eb3a18f4b88a"}
Apr 16 10:05:46.819466 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.819370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"66ff209db2375bd92765801ad81555475d48581dc2c4303dcff19d189d7db3f2"}
Apr 16 10:05:46.819466 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.819385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"06cf6de59e95d98cab01d7dba548cb334ab67f34dca0fbaf64f6283b39b9fc88"}
Apr 16 10:05:46.819466 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.819397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"f508425b662dc6b1ccb6b9406af8b15ec9146cbe47a20f5452fc761dad02b137"}
Apr 16 10:05:46.820674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.820641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gjdtj" event={"ID":"8a82420b-dfa7-4776-ada5-5f24f6e237d2","Type":"ContainerStarted","Data":"86aa61b0c0627623fe0d60f81bed3a66a9ffd8d7ec7b6b22bd8781710bcb42b2"}
Apr 16 10:05:46.822865 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.822837 2575 generic.go:358] "Generic (PLEG): container finished" podID="98bb1e21-3148-43ae-9b64-5d00f0aadc0d" containerID="d927ea3b687954cf985f3cec00e376ce93483fb3269b102a64895ea89442441c" exitCode=0
Apr 16 10:05:46.823020 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.822994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerDied","Data":"d927ea3b687954cf985f3cec00e376ce93483fb3269b102a64895ea89442441c"}
Apr 16 10:05:46.834095 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.834044 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d5h7c" podStartSLOduration=4.736138227 podStartE2EDuration="21.834026814s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.421725388 +0000 UTC m=+3.258328537" lastFinishedPulling="2026-04-16 10:05:45.519613959 +0000 UTC m=+20.356217124" observedRunningTime="2026-04-16 10:05:46.833449629 +0000 UTC m=+21.670052803" watchObservedRunningTime="2026-04-16 10:05:46.834026814 +0000 UTC m=+21.670629985"
Apr 16 10:05:46.846818 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.846777 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gjdtj" podStartSLOduration=4.78658432 podStartE2EDuration="21.846766178s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.418436418 +0000 UTC m=+3.255039567" lastFinishedPulling="2026-04-16 10:05:45.478618265 +0000 UTC m=+20.315221425" observedRunningTime="2026-04-16 10:05:46.846474919 +0000 UTC m=+21.683078101" watchObservedRunningTime="2026-04-16 10:05:46.846766178 +0000 UTC m=+21.683369348"
Apr 16 10:05:46.978600 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:46.978577 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 10:05:47.657019 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.656898 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T10:05:46.978595613Z","UUID":"90e0e1c8-149b-42aa-9f7c-c3efd9cf9226","Handler":null,"Name":"","Endpoint":""}
Apr 16 10:05:47.658777 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.658733 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 10:05:47.658777 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.658770 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 10:05:47.729331 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.729288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:47.729512 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:47.729411 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:47.826904 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.826774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h8xhq" event={"ID":"65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f","Type":"ContainerStarted","Data":"1dbd604a3a574a5931b0f814b3324a7c76e207d6c2b2879c05df082696022255"}
Apr 16 10:05:47.828952 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.828921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" event={"ID":"35bde40b-1b7c-437e-86f6-3c1a102c8bb0","Type":"ContainerStarted","Data":"5c25a7dc6659c5ee67be07b317f89d217944103c6dd243b3fda5c906f76e9ced"}
Apr 16 10:05:47.841781 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:47.841722 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h8xhq" podStartSLOduration=5.091990968 podStartE2EDuration="21.841703457s" podCreationTimestamp="2026-04-16 10:05:26 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.42248077 +0000 UTC m=+3.259083923" lastFinishedPulling="2026-04-16 10:05:45.172193262 +0000 UTC m=+20.008796412" observedRunningTime="2026-04-16 10:05:47.840936493 +0000 UTC m=+22.677539668" watchObservedRunningTime="2026-04-16 10:05:47.841703457 +0000 UTC m=+22.678306629"
Apr 16 10:05:48.729011 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:48.728976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:48.729140 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:48.729122 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:48.833632 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:48.833595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" event={"ID":"35bde40b-1b7c-437e-86f6-3c1a102c8bb0","Type":"ContainerStarted","Data":"597de3fdabd056074a5d1c457c2b0689ae1dfc3c45c3cf208af0ce0dcf9f5f1c"}
Apr 16 10:05:48.837549 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:48.837509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"914817ca6eae39dbdb1e21805b5bc1d03586824c20ac062163c3bbb5b5d912a1"}
Apr 16 10:05:48.851150 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:48.851091 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hn29x" podStartSLOduration=3.129930586 podStartE2EDuration="22.851071871s" podCreationTimestamp="2026-04-16 10:05:26 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.41396647 +0000 UTC m=+3.250569618" lastFinishedPulling="2026-04-16 10:05:48.135107747 +0000 UTC m=+22.971710903" observedRunningTime="2026-04-16 10:05:48.850543188 +0000 UTC m=+23.687146358" watchObservedRunningTime="2026-04-16 10:05:48.851071871 +0000 UTC m=+23.687675043"
Apr 16 10:05:49.728915 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:49.728877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:49.729069 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:49.729012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:50.728470 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:50.728431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:50.728866 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:50.728572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:50.775235 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:50.775197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:50.775878 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:50.775850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:50.841217 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:50.841189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cjccl"
Apr 16 10:05:51.728469 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.728293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:51.729311 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:51.728564 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:51.845370 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.845334 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" event={"ID":"e21bf7bf-cce9-4deb-8977-30b0f4341386","Type":"ContainerStarted","Data":"a63be882552ead02f9f10ba9c40b7d0b4c9da5e60c7068b031686973d3515d81"}
Apr 16 10:05:51.845723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.845693 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:51.845723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.845722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:51.846985 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.846956 2575 generic.go:358] "Generic (PLEG): container finished" podID="98bb1e21-3148-43ae-9b64-5d00f0aadc0d" containerID="466510e6dcf0d6036e9e9c4c034d225d25e08ef26bfba161f532577769714d6b" exitCode=0
Apr 16 10:05:51.847095 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.847037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerDied","Data":"466510e6dcf0d6036e9e9c4c034d225d25e08ef26bfba161f532577769714d6b"}
Apr 16 10:05:51.860336 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.860311 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:51.864535 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.864515 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:51.873021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:51.872981 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" podStartSLOduration=9.728300556 podStartE2EDuration="26.872968975s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.421324061 +0000 UTC m=+3.257927210" lastFinishedPulling="2026-04-16 10:05:45.565992464 +0000 UTC m=+20.402595629" observedRunningTime="2026-04-16 10:05:51.872683528 +0000 UTC m=+26.709286699" watchObservedRunningTime="2026-04-16 10:05:51.872968975 +0000 UTC m=+26.709572162"
Apr 16 10:05:52.728744 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.728635 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:52.728744 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:52.728739 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:52.851641 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.851598 2575 generic.go:358] "Generic (PLEG): container finished" podID="98bb1e21-3148-43ae-9b64-5d00f0aadc0d" containerID="20b58fe5455e5613e78b3cbc02f31b2cb5323a2ed3598a66a4ee338a9d334536" exitCode=0
Apr 16 10:05:52.851828 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.851812 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 10:05:52.851985 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.851926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerDied","Data":"20b58fe5455e5613e78b3cbc02f31b2cb5323a2ed3598a66a4ee338a9d334536"}
Apr 16 10:05:52.853394 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.853330 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8r8s4"]
Apr 16 10:05:52.853472 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.853416 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:52.853561 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:52.853536 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:52.855804 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.855782 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rvbk5"]
Apr 16 10:05:52.855915 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:52.855862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:52.856003 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:52.855981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:53.855999 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:53.855963 2575 generic.go:358] "Generic (PLEG): container finished" podID="98bb1e21-3148-43ae-9b64-5d00f0aadc0d" containerID="ee6ac69275c6a517628427701f1ad5e8df7d1ca284764f867b90e2227f7f1a42" exitCode=0
Apr 16 10:05:53.856486 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:53.856059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerDied","Data":"ee6ac69275c6a517628427701f1ad5e8df7d1ca284764f867b90e2227f7f1a42"}
Apr 16 10:05:53.856486 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:53.856225 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 10:05:54.728699 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:54.728665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:54.728894 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:54.728786 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:54.728894 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:54.728848 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:54.729008 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:54.728958 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:55.513228 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:55.513124 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h"
Apr 16 10:05:55.513694 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:55.513377 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 10:05:55.526321 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:55.526114 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" podUID="e21bf7bf-cce9-4deb-8977-30b0f4341386" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 10:05:55.536968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:55.536927 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" podUID="e21bf7bf-cce9-4deb-8977-30b0f4341386" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 10:05:56.728639 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:56.728600 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:56.729068 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:56.728654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:56.729068 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:56.728738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8r8s4" podUID="1591f776-8015-497b-a4bf-80b359c62427"
Apr 16 10:05:56.729068 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:56.728868 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rvbk5" podUID="b8122b7e-3e94-4772-bea0-462846dfdfab"
Apr 16 10:05:58.504269 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.504238 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeReady"
Apr 16 10:05:58.504692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.504401 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 10:05:58.547685 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.547564 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qjsr9"]
Apr 16 10:05:58.552737 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.552708 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2lw46"]
Apr 16 10:05:58.552907 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.552888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.555222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.555192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 10:05:58.555222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.555200 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ccsn6\""
Apr 16 10:05:58.555969 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.555952 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 10:05:58.556568 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.556547 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:58.559341 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.559317 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 10:05:58.559438 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.559345 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 10:05:58.560334 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.560113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 10:05:58.560650 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.560628 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qjsr9"]
Apr 16 10:05:58.563179 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.563139 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2lw46"]
Apr 16 10:05:58.564243 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.564221 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cwtl6\""
Apr 16 10:05:58.638787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.638749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2b517951-6607-404b-bc0f-a66ec956499a-tmp-dir\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.638972 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.638798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctj9c\" (UniqueName: \"kubernetes.io/projected/2b517951-6607-404b-bc0f-a66ec956499a-kube-api-access-ctj9c\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.638972 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.638828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:58.638972 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.638911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b517951-6607-404b-bc0f-a66ec956499a-config-volume\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.638972 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.638960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.639187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.638994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn7c5\" (UniqueName: \"kubernetes.io/projected/44072043-e38d-468a-ae02-9082c94f67cc-kube-api-access-nn7c5\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:58.728418 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.728355 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5"
Apr 16 10:05:58.728609 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.728380 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:58.731338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.731311 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 10:05:58.731338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.731320 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 10:05:58.731529 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.731320 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mvsr8\""
Apr 16 10:05:58.731529 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.731400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-btvgh\""
Apr 16 10:05:58.731529 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.731317 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 10:05:58.739591 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2b517951-6607-404b-bc0f-a66ec956499a-tmp-dir\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.739719 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctj9c\" (UniqueName: \"kubernetes.io/projected/2b517951-6607-404b-bc0f-a66ec956499a-kube-api-access-ctj9c\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.739719 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:58.739719 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:58.739713 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:58.739880 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:58.739776 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. No retries permitted until 2026-04-16 10:05:59.239757276 +0000 UTC m=+34.076360434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found
Apr 16 10:05:58.739939 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b517951-6607-404b-bc0f-a66ec956499a-config-volume\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.739939 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.740034 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2b517951-6607-404b-bc0f-a66ec956499a-tmp-dir\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.740034 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.739987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn7c5\" (UniqueName: \"kubernetes.io/projected/44072043-e38d-468a-ae02-9082c94f67cc-kube-api-access-nn7c5\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:58.740034 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:58.740024 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:58.740480 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:58.740100 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:59.240082914 +0000 UTC m=+34.076686077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:58.740480 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.740431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b517951-6607-404b-bc0f-a66ec956499a-config-volume\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:58.750073 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.749951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn7c5\" (UniqueName: \"kubernetes.io/projected/44072043-e38d-468a-ae02-9082c94f67cc-kube-api-access-nn7c5\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:58.750523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:58.750507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctj9c\" (UniqueName: \"kubernetes.io/projected/2b517951-6607-404b-bc0f-a66ec956499a-kube-api-access-ctj9c\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:59.243174 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.243110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:05:59.243371 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.243266 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:05:59.243371 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:59.243286 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:59.243499 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:59.243374 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. No retries permitted until 2026-04-16 10:06:00.243352435 +0000 UTC m=+35.079955589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found
Apr 16 10:05:59.243499 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:59.243414 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:59.243499 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:59.243466 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:00.243448614 +0000 UTC m=+35.080051805 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:59.344233 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.344194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4"
Apr 16 10:05:59.344404 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:59.344361 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 10:05:59.344490 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:05:59.344443 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:31.344422334 +0000 UTC m=+66.181025483 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : secret "metrics-daemon-secret" not found Apr 16 10:05:59.545758 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.545666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:59.548559 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.548531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grv77\" (UniqueName: \"kubernetes.io/projected/b8122b7e-3e94-4772-bea0-462846dfdfab-kube-api-access-grv77\") pod \"network-check-target-rvbk5\" (UID: \"b8122b7e-3e94-4772-bea0-462846dfdfab\") " pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:59.640394 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.640356 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:05:59.899628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:05:59.899597 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rvbk5"] Apr 16 10:05:59.903120 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:05:59.903095 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8122b7e_3e94_4772_bea0_462846dfdfab.slice/crio-efbd87cf1fbdcce9e87e880eeba1d88e5af5bd8ed31aac04c4ada07de0518229 WatchSource:0}: Error finding container efbd87cf1fbdcce9e87e880eeba1d88e5af5bd8ed31aac04c4ada07de0518229: Status 404 returned error can't find the container with id efbd87cf1fbdcce9e87e880eeba1d88e5af5bd8ed31aac04c4ada07de0518229 Apr 16 10:06:00.253209 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:00.252993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9" Apr 16 10:06:00.253395 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:00.253255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46" Apr 16 10:06:00.253395 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:00.253139 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:06:00.253395 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:00.253330 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 16 10:06:00.253395 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:00.253336 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:02.253318741 +0000 UTC m=+37.089921893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found Apr 16 10:06:00.253395 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:00.253367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. No retries permitted until 2026-04-16 10:06:02.253356127 +0000 UTC m=+37.089959276 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found Apr 16 10:06:00.873907 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:00.873870 2575 generic.go:358] "Generic (PLEG): container finished" podID="98bb1e21-3148-43ae-9b64-5d00f0aadc0d" containerID="9b5ea223b081c69aee53a0791aedcc4bf231e1c5bd5b36b62d34e1c0d78b1878" exitCode=0 Apr 16 10:06:00.874798 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:00.873955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerDied","Data":"9b5ea223b081c69aee53a0791aedcc4bf231e1c5bd5b36b62d34e1c0d78b1878"} Apr 16 10:06:00.875483 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:00.875418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rvbk5" event={"ID":"b8122b7e-3e94-4772-bea0-462846dfdfab","Type":"ContainerStarted","Data":"efbd87cf1fbdcce9e87e880eeba1d88e5af5bd8ed31aac04c4ada07de0518229"} Apr 16 10:06:01.881446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:01.881414 2575 generic.go:358] "Generic (PLEG): container finished" podID="98bb1e21-3148-43ae-9b64-5d00f0aadc0d" containerID="345fd6cc2e4cfb7ed537661c32ec433b34592e23be4b620a11a871036c5ce2a8" exitCode=0 Apr 16 10:06:01.881950 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:01.881471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerDied","Data":"345fd6cc2e4cfb7ed537661c32ec433b34592e23be4b620a11a871036c5ce2a8"} Apr 16 10:06:02.270392 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:02.270303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46" Apr 16 10:06:02.270392 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:02.270354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9" Apr 16 10:06:02.270591 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:02.270488 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:06:02.270591 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:02.270558 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:06.270535855 +0000 UTC m=+41.107139028 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found Apr 16 10:06:02.270591 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:02.270488 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:06:02.270717 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:02.270644 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. 
No retries permitted until 2026-04-16 10:06:06.270624715 +0000 UTC m=+41.107227868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found Apr 16 10:06:02.887106 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:02.887025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" event={"ID":"98bb1e21-3148-43ae-9b64-5d00f0aadc0d","Type":"ContainerStarted","Data":"2c8634c68f34bf5817a5ee10970eaaffc634e84a871ba3eee17d94eaa161213a"} Apr 16 10:06:02.910445 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:02.910403 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kvzsv" podStartSLOduration=6.57039399 podStartE2EDuration="37.910386627s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:05:28.410445941 +0000 UTC m=+3.247049092" lastFinishedPulling="2026-04-16 10:05:59.750438579 +0000 UTC m=+34.587041729" observedRunningTime="2026-04-16 10:06:02.908664833 +0000 UTC m=+37.745268005" watchObservedRunningTime="2026-04-16 10:06:02.910386627 +0000 UTC m=+37.746989797" Apr 16 10:06:03.890569 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:03.890526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rvbk5" event={"ID":"b8122b7e-3e94-4772-bea0-462846dfdfab","Type":"ContainerStarted","Data":"63808ed75494ce027cb48fc67169a2dd5171ef02c88fb45ee3447f851e4a0b03"} Apr 16 10:06:03.890958 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:03.890653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:06:03.905131 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:06:03.905079 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rvbk5" podStartSLOduration=34.901468335 podStartE2EDuration="37.905065253s" podCreationTimestamp="2026-04-16 10:05:26 +0000 UTC" firstStartedPulling="2026-04-16 10:05:59.905223243 +0000 UTC m=+34.741826391" lastFinishedPulling="2026-04-16 10:06:02.908820147 +0000 UTC m=+37.745423309" observedRunningTime="2026-04-16 10:06:03.904885283 +0000 UTC m=+38.741488454" watchObservedRunningTime="2026-04-16 10:06:03.905065253 +0000 UTC m=+38.741668454" Apr 16 10:06:06.296507 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:06.296457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46" Apr 16 10:06:06.296507 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:06.296510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9" Apr 16 10:06:06.296933 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:06.296626 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:06:06.296933 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:06.296632 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:06:06.296933 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:06.296681 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls 
podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:14.296665813 +0000 UTC m=+49.133268963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found Apr 16 10:06:06.296933 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:06.296695 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. No retries permitted until 2026-04-16 10:06:14.296688973 +0000 UTC m=+49.133292122 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found Apr 16 10:06:14.348971 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:14.348928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46" Apr 16 10:06:14.348971 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:14.348972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9" Apr 16 10:06:14.349561 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:14.349094 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 16 10:06:14.349561 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:14.349174 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. No retries permitted until 2026-04-16 10:06:30.349139514 +0000 UTC m=+65.185742672 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found Apr 16 10:06:14.349561 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:14.349094 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:06:14.349561 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:14.349220 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:30.349209119 +0000 UTC m=+65.185812268 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found Apr 16 10:06:25.535902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:25.535872 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7t7h" Apr 16 10:06:30.356839 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:30.356796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46" Apr 16 10:06:30.356839 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:30.356841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9" Apr 16 10:06:30.357305 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:30.356940 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:06:30.357305 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:30.356943 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:06:30.357305 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:30.356997 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert podName:44072043-e38d-468a-ae02-9082c94f67cc nodeName:}" failed. 
No retries permitted until 2026-04-16 10:07:02.35698231 +0000 UTC m=+97.193585464 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert") pod "ingress-canary-2lw46" (UID: "44072043-e38d-468a-ae02-9082c94f67cc") : secret "canary-serving-cert" not found Apr 16 10:06:30.357305 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:30.357011 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls podName:2b517951-6607-404b-bc0f-a66ec956499a nodeName:}" failed. No retries permitted until 2026-04-16 10:07:02.357005259 +0000 UTC m=+97.193608408 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls") pod "dns-default-qjsr9" (UID: "2b517951-6607-404b-bc0f-a66ec956499a") : secret "dns-default-metrics-tls" not found Apr 16 10:06:31.364787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:31.364742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:06:31.365206 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:31.364891 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 10:06:31.365206 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:31.364958 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs podName:1591f776-8015-497b-a4bf-80b359c62427 nodeName:}" failed. 
No retries permitted until 2026-04-16 10:07:35.364941081 +0000 UTC m=+130.201544230 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs") pod "network-metrics-daemon-8r8s4" (UID: "1591f776-8015-497b-a4bf-80b359c62427") : secret "metrics-daemon-secret" not found Apr 16 10:06:34.895529 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:34.895500 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rvbk5" Apr 16 10:06:47.332336 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.332190 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr"] Apr 16 10:06:47.335028 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.335007 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fcg2p"] Apr 16 10:06:47.335184 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.335148 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" Apr 16 10:06:47.336529 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.336510 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.337472 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.337452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7q6kg\"" Apr 16 10:06:47.337472 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.337464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.337616 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.337594 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:06:47.338528 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.338515 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.338887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.338871 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 10:06:47.339021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.338993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 10:06:47.339105 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.339067 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 10:06:47.339323 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.339308 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-sjqhx\"" Apr 16 10:06:47.341949 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.341929 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr"] Apr 16 10:06:47.343997 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.343553 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 10:06:47.344584 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.344566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fcg2p"] Apr 16 10:06:47.379197 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-tmp\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.379197 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-serving-cert\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.379402 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.379402 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zjw6k\" (UniqueName: \"kubernetes.io/projected/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-kube-api-access-zjw6k\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.379402 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-snapshots\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.379402 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.379402 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.379342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qff9q\" (UniqueName: \"kubernetes.io/projected/e13bd7e9-dbd9-408e-bd11-e92d36fb4d01-kube-api-access-qff9q\") pod \"volume-data-source-validator-7d955d5dd4-7b6rr\" (UID: \"e13bd7e9-dbd9-408e-bd11-e92d36fb4d01\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" Apr 16 10:06:47.438573 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.438546 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-8677bf5f94-ttzcm"] Apr 16 10:06:47.440949 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.440934 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.443331 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443296 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 10:06:47.443331 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443326 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 10:06:47.443541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443336 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.443541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 10:06:47.443656 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443601 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 10:06:47.443713 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443667 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-l6fb4\"" Apr 16 10:06:47.443713 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.443707 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 10:06:47.452731 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.452709 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8677bf5f94-ttzcm"] Apr 16 10:06:47.480453 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-tmp\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.480453 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-serving-cert\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.480647 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.480689 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-stats-auth\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.480736 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.480785 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:06:47.480731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjw6k\" (UniqueName: \"kubernetes.io/projected/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-kube-api-access-zjw6k\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.480785 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6txg\" (UniqueName: \"kubernetes.io/projected/029997ce-820e-46fb-9d03-11be41d65ce4-kube-api-access-k6txg\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.480785 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-snapshots\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.480929 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.480929 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-default-certificate\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.480929 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-tmp\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.480929 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.481126 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.480938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qff9q\" (UniqueName: \"kubernetes.io/projected/e13bd7e9-dbd9-408e-bd11-e92d36fb4d01-kube-api-access-qff9q\") pod \"volume-data-source-validator-7d955d5dd4-7b6rr\" (UID: \"e13bd7e9-dbd9-408e-bd11-e92d36fb4d01\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" Apr 16 10:06:47.481346 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.481325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.481346 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.481341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-snapshots\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.481749 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.481733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.484074 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.484054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-serving-cert\") pod \"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.487851 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.487827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qff9q\" (UniqueName: \"kubernetes.io/projected/e13bd7e9-dbd9-408e-bd11-e92d36fb4d01-kube-api-access-qff9q\") pod \"volume-data-source-validator-7d955d5dd4-7b6rr\" (UID: \"e13bd7e9-dbd9-408e-bd11-e92d36fb4d01\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" Apr 16 10:06:47.487952 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.487869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjw6k\" (UniqueName: \"kubernetes.io/projected/d7b584f5-67ae-4ab7-9eb5-4bd14248b512-kube-api-access-zjw6k\") pod 
\"insights-operator-5785d4fcdd-fcg2p\" (UID: \"d7b584f5-67ae-4ab7-9eb5-4bd14248b512\") " pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.545538 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.545504 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"] Apr 16 10:06:47.548012 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.547996 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq"] Apr 16 10:06:47.548448 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.548424 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:47.551009 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.550984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.551532 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.551289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8rktf\"" Apr 16 10:06:47.551532 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.551344 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 10:06:47.551985 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.551733 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:06:47.553208 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.553193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 10:06:47.553284 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:06:47.553194 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 10:06:47.553391 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.553377 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.553710 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.553696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-gj6ds\"" Apr 16 10:06:47.553785 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.553713 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:06:47.553848 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.553832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.559726 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.559704 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"] Apr 16 10:06:47.563865 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.563847 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq"] Apr 16 10:06:47.581505 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.581650 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:06:47.581518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-default-certificate\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.581650 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581553 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:47.581650 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cr2k\" (UniqueName: \"kubernetes.io/projected/d8417f60-7ed8-4533-89f7-b627ef6d286a-kube-api-access-4cr2k\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:47.581650 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e1c01b-c627-492b-b514-d6583deef22d-serving-cert\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.581650 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:47.581614 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: 
secret "router-metrics-certs-default" not found Apr 16 10:06:47.581887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e1c01b-c627-492b-b514-d6583deef22d-config\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.581887 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:47.581706 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:48.081682963 +0000 UTC m=+82.918286118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : secret "router-metrics-certs-default" not found Apr 16 10:06:47.581887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.581887 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-stats-auth\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " 
pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.582055 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6txg\" (UniqueName: \"kubernetes.io/projected/029997ce-820e-46fb-9d03-11be41d65ce4-kube-api-access-k6txg\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.582055 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.581937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbkt6\" (UniqueName: \"kubernetes.io/projected/24e1c01b-c627-492b-b514-d6583deef22d-kube-api-access-tbkt6\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.582055 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:47.582000 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:48.08198202 +0000 UTC m=+82.918585170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : configmap references non-existent config key: service-ca.crt Apr 16 10:06:47.584026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.583973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-default-certificate\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.584109 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.584056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-stats-auth\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.598779 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.598751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6txg\" (UniqueName: \"kubernetes.io/projected/029997ce-820e-46fb-9d03-11be41d65ce4-kube-api-access-k6txg\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:47.646532 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.646485 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" Apr 16 10:06:47.651759 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.651737 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" Apr 16 10:06:47.682907 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.682874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbkt6\" (UniqueName: \"kubernetes.io/projected/24e1c01b-c627-492b-b514-d6583deef22d-kube-api-access-tbkt6\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.683071 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.682925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:47.683071 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.682943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cr2k\" (UniqueName: \"kubernetes.io/projected/d8417f60-7ed8-4533-89f7-b627ef6d286a-kube-api-access-4cr2k\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:47.683071 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.682958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e1c01b-c627-492b-b514-d6583deef22d-serving-cert\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.683071 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:06:47.682978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e1c01b-c627-492b-b514-d6583deef22d-config\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.683549 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:47.683376 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 10:06:47.683549 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:47.683455 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls podName:d8417f60-7ed8-4533-89f7-b627ef6d286a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:48.183436632 +0000 UTC m=+83.020039785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6rzg2" (UID: "d8417f60-7ed8-4533-89f7-b627ef6d286a") : secret "samples-operator-tls" not found Apr 16 10:06:47.683549 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.683483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e1c01b-c627-492b-b514-d6583deef22d-config\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.685944 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.685876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e1c01b-c627-492b-b514-d6583deef22d-serving-cert\") pod 
\"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.691913 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.691887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbkt6\" (UniqueName: \"kubernetes.io/projected/24e1c01b-c627-492b-b514-d6583deef22d-kube-api-access-tbkt6\") pod \"service-ca-operator-69965bb79d-dbqbq\" (UID: \"24e1c01b-c627-492b-b514-d6583deef22d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.692174 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.692129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cr2k\" (UniqueName: \"kubernetes.io/projected/d8417f60-7ed8-4533-89f7-b627ef6d286a-kube-api-access-4cr2k\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:47.767662 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.767628 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr"] Apr 16 10:06:47.771398 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:06:47.771371 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13bd7e9_dbd9_408e_bd11_e92d36fb4d01.slice/crio-a536cc33f26a9c48286fe3deb0f34aabb8afdc7e4c33270cc7ce2a5d3522994a WatchSource:0}: Error finding container a536cc33f26a9c48286fe3deb0f34aabb8afdc7e4c33270cc7ce2a5d3522994a: Status 404 returned error can't find the container with id a536cc33f26a9c48286fe3deb0f34aabb8afdc7e4c33270cc7ce2a5d3522994a Apr 16 10:06:47.785793 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.785768 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-5785d4fcdd-fcg2p"] Apr 16 10:06:47.789047 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:06:47.789004 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b584f5_67ae_4ab7_9eb5_4bd14248b512.slice/crio-b98f88c4f3a79a92b0c9811cde2a286306d83ad6a00335dc28ec2b9ebbca948f WatchSource:0}: Error finding container b98f88c4f3a79a92b0c9811cde2a286306d83ad6a00335dc28ec2b9ebbca948f: Status 404 returned error can't find the container with id b98f88c4f3a79a92b0c9811cde2a286306d83ad6a00335dc28ec2b9ebbca948f Apr 16 10:06:47.869862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.869763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" Apr 16 10:06:47.976033 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.975997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" event={"ID":"d7b584f5-67ae-4ab7-9eb5-4bd14248b512","Type":"ContainerStarted","Data":"b98f88c4f3a79a92b0c9811cde2a286306d83ad6a00335dc28ec2b9ebbca948f"} Apr 16 10:06:47.976943 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.976917 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" event={"ID":"e13bd7e9-dbd9-408e-bd11-e92d36fb4d01","Type":"ContainerStarted","Data":"a536cc33f26a9c48286fe3deb0f34aabb8afdc7e4c33270cc7ce2a5d3522994a"} Apr 16 10:06:47.982800 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:47.982774 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq"] Apr 16 10:06:47.985782 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:06:47.985752 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e1c01b_c627_492b_b514_d6583deef22d.slice/crio-233d94d1b3d16f6966773a68df8999aaeb7e6ecb5f7a3fb256da86d5269dd9a1 WatchSource:0}: Error finding container 233d94d1b3d16f6966773a68df8999aaeb7e6ecb5f7a3fb256da86d5269dd9a1: Status 404 returned error can't find the container with id 233d94d1b3d16f6966773a68df8999aaeb7e6ecb5f7a3fb256da86d5269dd9a1 Apr 16 10:06:48.086125 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:48.086085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:48.086125 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:48.086131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm" Apr 16 10:06:48.086356 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:48.086269 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 10:06:48.086356 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:48.086288 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:49.086270409 +0000 UTC m=+83.922873557 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : configmap references non-existent config key: service-ca.crt Apr 16 10:06:48.086356 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:48.086316 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:49.086307896 +0000 UTC m=+83.922911053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : secret "router-metrics-certs-default" not found Apr 16 10:06:48.187236 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:48.187103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" Apr 16 10:06:48.187405 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:48.187260 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 10:06:48.187405 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:48.187325 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls podName:d8417f60-7ed8-4533-89f7-b627ef6d286a nodeName:}" failed. 
No retries permitted until 2026-04-16 10:06:49.187312706 +0000 UTC m=+84.023915855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6rzg2" (UID: "d8417f60-7ed8-4533-89f7-b627ef6d286a") : secret "samples-operator-tls" not found
Apr 16 10:06:48.981022 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:48.980979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" event={"ID":"24e1c01b-c627-492b-b514-d6583deef22d","Type":"ContainerStarted","Data":"233d94d1b3d16f6966773a68df8999aaeb7e6ecb5f7a3fb256da86d5269dd9a1"}
Apr 16 10:06:49.095584 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:49.095541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:06:49.095784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:49.095621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:06:49.095784 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:49.095740 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 10:06:49.095784 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:49.095775 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:51.0957512 +0000 UTC m=+85.932354372 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : configmap references non-existent config key: service-ca.crt
Apr 16 10:06:49.095959 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:49.095806 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:51.095795677 +0000 UTC m=+85.932398838 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : secret "router-metrics-certs-default" not found
Apr 16 10:06:49.196365 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:49.196328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"
Apr 16 10:06:49.196534 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:49.196504 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 10:06:49.196606 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:49.196573 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls podName:d8417f60-7ed8-4533-89f7-b627ef6d286a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:51.196553161 +0000 UTC m=+86.033156326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6rzg2" (UID: "d8417f60-7ed8-4533-89f7-b627ef6d286a") : secret "samples-operator-tls" not found
Apr 16 10:06:50.978220 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.978180 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"]
Apr 16 10:06:50.980457 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.980436 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"
Apr 16 10:06:50.982618 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.982596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7ffs9\""
Apr 16 10:06:50.988209 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.987983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" event={"ID":"24e1c01b-c627-492b-b514-d6583deef22d","Type":"ContainerStarted","Data":"a0be17286d622c5fe138d80abeb2f6433baf7e2df1b04011baa10b2fc29116c2"}
Apr 16 10:06:50.988354 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.988328 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"]
Apr 16 10:06:50.989815 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.989598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" event={"ID":"d7b584f5-67ae-4ab7-9eb5-4bd14248b512","Type":"ContainerStarted","Data":"47c8b1f8a924b8f10f37de05774db99857e9bd2bd5d79b538df6da1a8f34adf4"}
Apr 16 10:06:50.991244 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:50.991222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" event={"ID":"e13bd7e9-dbd9-408e-bd11-e92d36fb4d01","Type":"ContainerStarted","Data":"f71135aa3b5319cabfe885090ae6d97576859e8f14ecc68ce2d5cad9a3fc76f8"}
Apr 16 10:06:51.010756 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.010696 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7b6rr" podStartSLOduration=1.4228826749999999 podStartE2EDuration="4.010680069s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:06:47.773033565 +0000 UTC m=+82.609636714" lastFinishedPulling="2026-04-16 10:06:50.36083095 +0000 UTC m=+85.197434108" observedRunningTime="2026-04-16 10:06:51.0104422 +0000 UTC m=+85.847045373" watchObservedRunningTime="2026-04-16 10:06:51.010680069 +0000 UTC m=+85.847283241"
Apr 16 10:06:51.012754 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.012727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5lgk\" (UniqueName: \"kubernetes.io/projected/a0d64e3c-30b9-4ca7-b47f-f9c59706692a-kube-api-access-d5lgk\") pod \"network-check-source-7b678d77c7-5qr2r\" (UID: \"a0d64e3c-30b9-4ca7-b47f-f9c59706692a\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"
Apr 16 10:06:51.044540 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.044483 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" podStartSLOduration=1.470528949 podStartE2EDuration="4.044464963s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:06:47.790749422 +0000 UTC m=+82.627352575" lastFinishedPulling="2026-04-16 10:06:50.364685437 +0000 UTC m=+85.201288589" observedRunningTime="2026-04-16 10:06:51.026496401 +0000 UTC m=+85.863099572" watchObservedRunningTime="2026-04-16 10:06:51.044464963 +0000 UTC m=+85.881068135"
Apr 16 10:06:51.045401 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.045368 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" podStartSLOduration=1.666290697 podStartE2EDuration="4.045358512s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:06:47.987354923 +0000 UTC m=+82.823958073" lastFinishedPulling="2026-04-16 10:06:50.366422733 +0000 UTC m=+85.203025888" observedRunningTime="2026-04-16 10:06:51.044361358 +0000 UTC m=+85.880964530" watchObservedRunningTime="2026-04-16 10:06:51.045358512 +0000 UTC m=+85.881961685"
Apr 16 10:06:51.114140 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.114094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5lgk\" (UniqueName: \"kubernetes.io/projected/a0d64e3c-30b9-4ca7-b47f-f9c59706692a-kube-api-access-d5lgk\") pod \"network-check-source-7b678d77c7-5qr2r\" (UID: \"a0d64e3c-30b9-4ca7-b47f-f9c59706692a\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"
Apr 16 10:06:51.114140 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.114138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:06:51.114559 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:51.114334 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 10:06:51.114559 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:51.114402 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:55.114381798 +0000 UTC m=+89.950984960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : secret "router-metrics-certs-default" not found
Apr 16 10:06:51.114679 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.114571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:06:51.114735 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:51.114690 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:55.114674351 +0000 UTC m=+89.951277504 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : configmap references non-existent config key: service-ca.crt
Apr 16 10:06:51.121917 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.121891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5lgk\" (UniqueName: \"kubernetes.io/projected/a0d64e3c-30b9-4ca7-b47f-f9c59706692a-kube-api-access-d5lgk\") pod \"network-check-source-7b678d77c7-5qr2r\" (UID: \"a0d64e3c-30b9-4ca7-b47f-f9c59706692a\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"
Apr 16 10:06:51.215544 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.215500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"
Apr 16 10:06:51.215721 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:51.215620 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 10:06:51.215721 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:51.215674 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls podName:d8417f60-7ed8-4533-89f7-b627ef6d286a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:55.21565925 +0000 UTC m=+90.052262399 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6rzg2" (UID: "d8417f60-7ed8-4533-89f7-b627ef6d286a") : secret "samples-operator-tls" not found
Apr 16 10:06:51.291775 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.291687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"
Apr 16 10:06:51.410912 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.410878 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r"]
Apr 16 10:06:51.413727 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:06:51.413694 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0d64e3c_30b9_4ca7_b47f_f9c59706692a.slice/crio-c3ca3e74ee958013e79be805fd387aca55027144647455e22e997feb0eca4a3d WatchSource:0}: Error finding container c3ca3e74ee958013e79be805fd387aca55027144647455e22e997feb0eca4a3d: Status 404 returned error can't find the container with id c3ca3e74ee958013e79be805fd387aca55027144647455e22e997feb0eca4a3d
Apr 16 10:06:51.994829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.994793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r" event={"ID":"a0d64e3c-30b9-4ca7-b47f-f9c59706692a","Type":"ContainerStarted","Data":"19580b40b0d56a9ab5b0f0a20f8ea7954ef77b9e5ec3ca8128c88c9476cce362"}
Apr 16 10:06:51.994829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:51.994832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r" event={"ID":"a0d64e3c-30b9-4ca7-b47f-f9c59706692a","Type":"ContainerStarted","Data":"c3ca3e74ee958013e79be805fd387aca55027144647455e22e997feb0eca4a3d"}
Apr 16 10:06:52.009099 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:52.009047 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-5qr2r" podStartSLOduration=2.009032298 podStartE2EDuration="2.009032298s" podCreationTimestamp="2026-04-16 10:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:06:52.008612809 +0000 UTC m=+86.845215981" watchObservedRunningTime="2026-04-16 10:06:52.009032298 +0000 UTC m=+86.845635470"
Apr 16 10:06:53.359563 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:53.359536 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gjdtj_8a82420b-dfa7-4776-ada5-5f24f6e237d2/dns-node-resolver/0.log"
Apr 16 10:06:54.560185 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:54.560143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rv2dg_6f990235-90f2-4344-9fbc-4a60dac858ad/node-ca/0.log"
Apr 16 10:06:55.146416 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:55.146371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:06:55.146635 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:55.146436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:06:55.146635 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:55.146570 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 10:06:55.146635 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:55.146574 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:03.146551983 +0000 UTC m=+97.983155146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : configmap references non-existent config key: service-ca.crt
Apr 16 10:06:55.146635 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:55.146629 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs podName:029997ce-820e-46fb-9d03-11be41d65ce4 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:03.14661584 +0000 UTC m=+97.983218989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs") pod "router-default-8677bf5f94-ttzcm" (UID: "029997ce-820e-46fb-9d03-11be41d65ce4") : secret "router-metrics-certs-default" not found
Apr 16 10:06:55.247557 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:06:55.247518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"
Apr 16 10:06:55.247731 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:55.247660 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 10:06:55.247731 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:06:55.247727 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls podName:d8417f60-7ed8-4533-89f7-b627ef6d286a nodeName:}" failed. No retries permitted until 2026-04-16 10:07:03.247711466 +0000 UTC m=+98.084314616 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6rzg2" (UID: "d8417f60-7ed8-4533-89f7-b627ef6d286a") : secret "samples-operator-tls" not found
Apr 16 10:07:02.403736 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.403703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:07:02.404123 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.403751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:07:02.406067 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.406048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b517951-6607-404b-bc0f-a66ec956499a-metrics-tls\") pod \"dns-default-qjsr9\" (UID: \"2b517951-6607-404b-bc0f-a66ec956499a\") " pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:07:02.406638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.406620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44072043-e38d-468a-ae02-9082c94f67cc-cert\") pod \"ingress-canary-2lw46\" (UID: \"44072043-e38d-468a-ae02-9082c94f67cc\") " pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:07:02.472984 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.472953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ccsn6\""
Apr 16 10:07:02.479758 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.479735 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cwtl6\""
Apr 16 10:07:02.481392 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.481376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:07:02.488000 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.487975 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2lw46"
Apr 16 10:07:02.608776 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.608596 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qjsr9"]
Apr 16 10:07:02.611219 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:02.611196 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b517951_6607_404b_bc0f_a66ec956499a.slice/crio-6db3fa4eed1ff892da8fc7d85c1d2534262b99252d9cf59c85f76909bae65bf2 WatchSource:0}: Error finding container 6db3fa4eed1ff892da8fc7d85c1d2534262b99252d9cf59c85f76909bae65bf2: Status 404 returned error can't find the container with id 6db3fa4eed1ff892da8fc7d85c1d2534262b99252d9cf59c85f76909bae65bf2
Apr 16 10:07:02.623436 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:02.623408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2lw46"]
Apr 16 10:07:02.626029 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:02.626001 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44072043_e38d_468a_ae02_9082c94f67cc.slice/crio-012b97c084015f53c626745e6492222fcb15f1be95a6507072e62d46b9f2510d WatchSource:0}: Error finding container 012b97c084015f53c626745e6492222fcb15f1be95a6507072e62d46b9f2510d: Status 404 returned error can't find the container with id 012b97c084015f53c626745e6492222fcb15f1be95a6507072e62d46b9f2510d
Apr 16 10:07:03.020178 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.020126 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjsr9" event={"ID":"2b517951-6607-404b-bc0f-a66ec956499a","Type":"ContainerStarted","Data":"6db3fa4eed1ff892da8fc7d85c1d2534262b99252d9cf59c85f76909bae65bf2"}
Apr 16 10:07:03.021274 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.021244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2lw46" event={"ID":"44072043-e38d-468a-ae02-9082c94f67cc","Type":"ContainerStarted","Data":"012b97c084015f53c626745e6492222fcb15f1be95a6507072e62d46b9f2510d"}
Apr 16 10:07:03.211670 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.211625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:03.211842 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.211685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:03.212461 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.212416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029997ce-820e-46fb-9d03-11be41d65ce4-service-ca-bundle\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:03.214751 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.214719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/029997ce-820e-46fb-9d03-11be41d65ce4-metrics-certs\") pod \"router-default-8677bf5f94-ttzcm\" (UID: \"029997ce-820e-46fb-9d03-11be41d65ce4\") " pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:03.313137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.313095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"
Apr 16 10:07:03.316356 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.316325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8417f60-7ed8-4533-89f7-b627ef6d286a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6rzg2\" (UID: \"d8417f60-7ed8-4533-89f7-b627ef6d286a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"
Apr 16 10:07:03.349745 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.349481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:03.462715 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.462570 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"
Apr 16 10:07:03.505898 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.505591 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8677bf5f94-ttzcm"]
Apr 16 10:07:03.508594 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:03.508562 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029997ce_820e_46fb_9d03_11be41d65ce4.slice/crio-5319638b40ffed6bac5028ef81514c8dca723cb6abfbaf20a8efd7f86a752459 WatchSource:0}: Error finding container 5319638b40ffed6bac5028ef81514c8dca723cb6abfbaf20a8efd7f86a752459: Status 404 returned error can't find the container with id 5319638b40ffed6bac5028ef81514c8dca723cb6abfbaf20a8efd7f86a752459
Apr 16 10:07:03.620219 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:03.620179 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2"]
Apr 16 10:07:04.029717 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:04.029629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" event={"ID":"d8417f60-7ed8-4533-89f7-b627ef6d286a","Type":"ContainerStarted","Data":"9b639ef645a3ab56541a95c8320b523c4adc2fd281ce3a50c124f2a45b450565"}
Apr 16 10:07:04.031078 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:04.031048 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8677bf5f94-ttzcm" event={"ID":"029997ce-820e-46fb-9d03-11be41d65ce4","Type":"ContainerStarted","Data":"583d7047c1bdd91668bcd1a8984a6ded95d53813f18fe59873a3ab2b5fa26720"}
Apr 16 10:07:04.031078 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:04.031078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8677bf5f94-ttzcm" event={"ID":"029997ce-820e-46fb-9d03-11be41d65ce4","Type":"ContainerStarted","Data":"5319638b40ffed6bac5028ef81514c8dca723cb6abfbaf20a8efd7f86a752459"}
Apr 16 10:07:04.052099 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:04.052046 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-8677bf5f94-ttzcm" podStartSLOduration=17.052028076 podStartE2EDuration="17.052028076s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:07:04.050518009 +0000 UTC m=+98.887121182" watchObservedRunningTime="2026-04-16 10:07:04.052028076 +0000 UTC m=+98.888631247"
Apr 16 10:07:04.350598 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:04.350554 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:04.353603 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:04.353565 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:05.036313 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.036208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2lw46" event={"ID":"44072043-e38d-468a-ae02-9082c94f67cc","Type":"ContainerStarted","Data":"a794e6c42754e7e98e9c95e8fca82f9d40ff7bdd999100a695c60b4b13912a69"}
Apr 16 10:07:05.037757 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.037728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjsr9" event={"ID":"2b517951-6607-404b-bc0f-a66ec956499a","Type":"ContainerStarted","Data":"5847b798676fde069e28fdb1e5c874aa5e8a88ccaaa35a45fd5c1cf362dc4aba"}
Apr 16 10:07:05.037757 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.037758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjsr9" event={"ID":"2b517951-6607-404b-bc0f-a66ec956499a","Type":"ContainerStarted","Data":"7fe2e93880e5f0d49de0bb93519f9ec98a80e30710eabaa595b93a8294f38297"}
Apr 16 10:07:05.037939 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.037787 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:05.037996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.037938 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qjsr9"
Apr 16 10:07:05.039058 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.039038 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-8677bf5f94-ttzcm"
Apr 16 10:07:05.056012 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.055974 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2lw46" podStartSLOduration=65.107443461 podStartE2EDuration="1m7.055961375s" podCreationTimestamp="2026-04-16 10:05:58 +0000 UTC" firstStartedPulling="2026-04-16 10:07:02.627985778 +0000 UTC m=+97.464588928" lastFinishedPulling="2026-04-16 10:07:04.576503689 +0000 UTC m=+99.413106842" observedRunningTime="2026-04-16 10:07:05.055730242 +0000 UTC m=+99.892333413" watchObservedRunningTime="2026-04-16 10:07:05.055961375 +0000 UTC m=+99.892564543"
Apr 16 10:07:05.099488 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:05.099345 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qjsr9" podStartSLOduration=65.135596845 podStartE2EDuration="1m7.099327319s" podCreationTimestamp="2026-04-16 10:05:58 +0000 UTC" firstStartedPulling="2026-04-16 10:07:02.613076054 +0000 UTC m=+97.449679206" lastFinishedPulling="2026-04-16 10:07:04.57680653 +0000 UTC m=+99.413409680" observedRunningTime="2026-04-16 10:07:05.098484385 +0000 UTC m=+99.935087557" watchObservedRunningTime="2026-04-16 10:07:05.099327319 +0000 UTC m=+99.935930488"
Apr 16 10:07:07.044014 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:07.043979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" event={"ID":"d8417f60-7ed8-4533-89f7-b627ef6d286a","Type":"ContainerStarted","Data":"61379f6d98351154c54c6d153fe7689f1cd1c227d9448c941c733c0bde769762"}
Apr 16 10:07:07.044014 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:07.044017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" event={"ID":"d8417f60-7ed8-4533-89f7-b627ef6d286a","Type":"ContainerStarted","Data":"2052b285793890d656bf3e21ec00a12eba5f0d1cfa178486fc698621101fde04"}
Apr 16 10:07:07.061173 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:07.061112 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6rzg2" podStartSLOduration=17.66707565 podStartE2EDuration="20.061099303s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:07:03.669446644 +0000 UTC m=+98.506049799" lastFinishedPulling="2026-04-16 10:07:06.0634703 +0000 UTC m=+100.900073452" observedRunningTime="2026-04-16 10:07:07.059986615 +0000 UTC m=+101.896589785" watchObservedRunningTime="2026-04-16 10:07:07.061099303 +0000 UTC m=+101.897702473"
Apr 16 10:07:13.957999 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:13.957968 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jjxh6"]
Apr 16 10:07:13.961544 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:13.961514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jjxh6"
Apr 16 10:07:13.964843 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:13.964818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vnpkd\""
Apr 16 10:07:13.964843 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:13.964836 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 10:07:13.965022 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:13.964835 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 10:07:13.973986 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:13.973962 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jjxh6"]
Apr 16 10:07:14.047506 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.047478 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57d769fd65-pw5bg"]
Apr 16 10:07:14.050543 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.050525 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg"
Apr 16 10:07:14.053032 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.053011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 10:07:14.053032 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.053013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kpfjj\""
Apr 16 10:07:14.053299 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.053286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 10:07:14.053353 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.053336 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 10:07:14.061521 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.061497 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 10:07:14.064015 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.063990 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57d769fd65-pw5bg"]
Apr 16 10:07:14.090127 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.090100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6"
Apr 16 10:07:14.090299 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.090132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6"
Apr 16 10:07:14.090299 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.090173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-crio-socket\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6"
Apr 16 10:07:14.090299 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.090239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-data-volume\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6"
Apr 16 10:07:14.090417 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.090342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b4b4\" (UniqueName: \"kubernetes.io/projected/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-kube-api-access-5b4b4\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6"
Apr 16 10:07:14.191183 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjxh6\" 
(UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191183 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-crio-socket\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-installation-pull-secrets\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md48r\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-kube-api-access-md48r\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-trusted-ca\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191313 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-crio-socket\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-ca-trust-extracted\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-registry-certificates\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-data-volume\") pod 
\"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-registry-tls\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191871 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-bound-sa-token\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191871 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-image-registry-private-configuration\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.191871 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b4b4\" (UniqueName: \"kubernetes.io/projected/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-kube-api-access-5b4b4\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191871 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-data-volume\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.191871 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.191797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.193670 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.193654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.204460 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.204435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b4b4\" (UniqueName: \"kubernetes.io/projected/a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd-kube-api-access-5b4b4\") pod \"insights-runtime-extractor-jjxh6\" (UID: \"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd\") " pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.271292 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.271212 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jjxh6" Apr 16 10:07:14.292361 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md48r\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-kube-api-access-md48r\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292502 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-trusted-ca\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292502 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-ca-trust-extracted\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292502 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-registry-certificates\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292502 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-registry-tls\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292502 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292502 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-bound-sa-token\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292772 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-image-registry-private-configuration\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292772 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-installation-pull-secrets\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.292885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.292864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-ca-trust-extracted\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " 
pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.293438 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.293413 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-registry-certificates\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.293574 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.293489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-trusted-ca\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.294962 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.294931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-installation-pull-secrets\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.295102 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.295085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-image-registry-private-configuration\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.296016 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.295996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-registry-tls\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.303049 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.303029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-bound-sa-token\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.303214 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.303195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md48r\" (UniqueName: \"kubernetes.io/projected/fc7c6c13-2a7a-4b02-8c2e-c3648a528f96-kube-api-access-md48r\") pod \"image-registry-57d769fd65-pw5bg\" (UID: \"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96\") " pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.370227 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.366362 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:14.413655 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.413619 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jjxh6"] Apr 16 10:07:14.416728 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:14.416700 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d5854b_e8b4_4005_8d0d_bdcbabaf64fd.slice/crio-8fb02539f3875aca83b7d176b66fbba0c5b01aa6d330f271b9b0be07ccf70791 WatchSource:0}: Error finding container 8fb02539f3875aca83b7d176b66fbba0c5b01aa6d330f271b9b0be07ccf70791: Status 404 returned error can't find the container with id 8fb02539f3875aca83b7d176b66fbba0c5b01aa6d330f271b9b0be07ccf70791 Apr 16 10:07:14.501181 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:14.501135 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57d769fd65-pw5bg"] Apr 16 10:07:14.504111 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:14.504067 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7c6c13_2a7a_4b02_8c2e_c3648a528f96.slice/crio-2a53a6a62c0595c792edd4880a7ddc39dcf5649f3675d90bf7cd5b90698a9803 WatchSource:0}: Error finding container 2a53a6a62c0595c792edd4880a7ddc39dcf5649f3675d90bf7cd5b90698a9803: Status 404 returned error can't find the container with id 2a53a6a62c0595c792edd4880a7ddc39dcf5649f3675d90bf7cd5b90698a9803 Apr 16 10:07:15.042050 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.042021 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qjsr9" Apr 16 10:07:15.066632 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.066576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" 
event={"ID":"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96","Type":"ContainerStarted","Data":"a837442f038ee221f478134297e255914159420aa9dae0a1d27268bc6c9446b3"} Apr 16 10:07:15.066632 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.066622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" event={"ID":"fc7c6c13-2a7a-4b02-8c2e-c3648a528f96","Type":"ContainerStarted","Data":"2a53a6a62c0595c792edd4880a7ddc39dcf5649f3675d90bf7cd5b90698a9803"} Apr 16 10:07:15.066880 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.066683 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:15.068097 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.068060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjxh6" event={"ID":"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd","Type":"ContainerStarted","Data":"ac751c930acb75e33ff79cf38badff5b3bda2bd63708040b68774ff8783b4033"} Apr 16 10:07:15.068097 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.068090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjxh6" event={"ID":"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd","Type":"ContainerStarted","Data":"8fb02539f3875aca83b7d176b66fbba0c5b01aa6d330f271b9b0be07ccf70791"} Apr 16 10:07:15.087646 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:15.087602 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" podStartSLOduration=1.087585769 podStartE2EDuration="1.087585769s" podCreationTimestamp="2026-04-16 10:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:07:15.084699068 +0000 UTC m=+109.921302263" watchObservedRunningTime="2026-04-16 10:07:15.087585769 
+0000 UTC m=+109.924188940" Apr 16 10:07:16.072334 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:16.072292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjxh6" event={"ID":"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd","Type":"ContainerStarted","Data":"a20a453560d883a674b4b069a4fae65261d0b69de0b36782c3d2478830ab0b98"} Apr 16 10:07:17.076286 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:17.076250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjxh6" event={"ID":"a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd","Type":"ContainerStarted","Data":"e15bda5b7fa762249d5e6d8fd5edc139c7143a0c42065cf73c81ffbf8a1192ca"} Apr 16 10:07:17.093214 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:17.093148 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jjxh6" podStartSLOduration=1.90223264 podStartE2EDuration="4.093132939s" podCreationTimestamp="2026-04-16 10:07:13 +0000 UTC" firstStartedPulling="2026-04-16 10:07:14.480624293 +0000 UTC m=+109.317227442" lastFinishedPulling="2026-04-16 10:07:16.671524576 +0000 UTC m=+111.508127741" observedRunningTime="2026-04-16 10:07:17.092641135 +0000 UTC m=+111.929244303" watchObservedRunningTime="2026-04-16 10:07:17.093132939 +0000 UTC m=+111.929736109" Apr 16 10:07:22.008173 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.008135 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6"] Apr 16 10:07:22.012740 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.012722 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:22.015787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.015761 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7n7dm\"" Apr 16 10:07:22.016273 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.016256 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 10:07:22.019734 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.019713 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6"] Apr 16 10:07:22.153278 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.153246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c0790d3e-5135-4465-9ef5-f83578c6593f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-s7rt6\" (UID: \"c0790d3e-5135-4465-9ef5-f83578c6593f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:22.254260 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.254218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c0790d3e-5135-4465-9ef5-f83578c6593f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-s7rt6\" (UID: \"c0790d3e-5135-4465-9ef5-f83578c6593f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:22.256570 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.256549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/c0790d3e-5135-4465-9ef5-f83578c6593f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-s7rt6\" (UID: \"c0790d3e-5135-4465-9ef5-f83578c6593f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:22.322184 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.322135 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:22.437775 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:22.437746 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6"] Apr 16 10:07:22.440960 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:22.440934 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0790d3e_5135_4465_9ef5_f83578c6593f.slice/crio-8422501bb212e1569684a58f8da384232e0478c51857497b97b1576a711a1068 WatchSource:0}: Error finding container 8422501bb212e1569684a58f8da384232e0478c51857497b97b1576a711a1068: Status 404 returned error can't find the container with id 8422501bb212e1569684a58f8da384232e0478c51857497b97b1576a711a1068 Apr 16 10:07:23.093296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:23.093258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" event={"ID":"c0790d3e-5135-4465-9ef5-f83578c6593f","Type":"ContainerStarted","Data":"8422501bb212e1569684a58f8da384232e0478c51857497b97b1576a711a1068"} Apr 16 10:07:24.097031 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:24.096991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" 
event={"ID":"c0790d3e-5135-4465-9ef5-f83578c6593f","Type":"ContainerStarted","Data":"a00d2e8f5fda95386b5dd1c68ca64e7196559863503f38d2a48196b6e37a5900"} Apr 16 10:07:24.097399 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:24.097215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:24.102513 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:24.102488 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" Apr 16 10:07:24.114316 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:24.114266 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-s7rt6" podStartSLOduration=1.866817816 podStartE2EDuration="3.114254873s" podCreationTimestamp="2026-04-16 10:07:21 +0000 UTC" firstStartedPulling="2026-04-16 10:07:22.442822813 +0000 UTC m=+117.279425963" lastFinishedPulling="2026-04-16 10:07:23.690259866 +0000 UTC m=+118.526863020" observedRunningTime="2026-04-16 10:07:24.113621452 +0000 UTC m=+118.950224619" watchObservedRunningTime="2026-04-16 10:07:24.114254873 +0000 UTC m=+118.950858043" Apr 16 10:07:30.444029 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.443992 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5"] Apr 16 10:07:30.448431 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.448412 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.451880 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.451852 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 10:07:30.452035 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.451852 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 10:07:30.452035 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.451853 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ct5ng\"" Apr 16 10:07:30.452244 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.452226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 10:07:30.452331 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.452228 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 10:07:30.452331 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.452271 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 10:07:30.456397 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.456376 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5"] Apr 16 10:07:30.477268 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.477231 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-pgz48"] Apr 16 10:07:30.480865 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.480842 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.483126 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.483106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 10:07:30.483280 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.483130 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-lxft2\"" Apr 16 10:07:30.483662 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.483647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 10:07:30.483741 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.483650 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 10:07:30.487527 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.487507 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9lz6x"] Apr 16 10:07:30.490551 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.490532 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.492821 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.492799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 10:07:30.493321 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.493300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 10:07:30.493426 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.493341 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 10:07:30.493643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.493627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7m7d8\"" Apr 16 10:07:30.494110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.494090 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-pgz48"] Apr 16 10:07:30.622760 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24678d87-1b27-4719-b309-5f2c3c4150d3-metrics-client-ca\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.622760 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h84x\" (UniqueName: \"kubernetes.io/projected/a6b84455-6de2-4e18-9ce9-064521f88942-kube-api-access-6h84x\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.622989 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.622989 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.622989 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.622989 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fd12477-bb9e-4a4a-9298-5e067ebfd148-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.622989 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-root\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.622989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76428\" (UniqueName: \"kubernetes.io/projected/24678d87-1b27-4719-b309-5f2c3c4150d3-kube-api-access-76428\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fd12477-bb9e-4a4a-9298-5e067ebfd148-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-tls\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-wtmp\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.623189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6b84455-6de2-4e18-9ce9-064521f88942-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.623423 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9r7\" (UniqueName: 
\"kubernetes.io/projected/1fd12477-bb9e-4a4a-9298-5e067ebfd148-kube-api-access-jp9r7\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.623423 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-textfile\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.623423 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fd12477-bb9e-4a4a-9298-5e067ebfd148-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.623423 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a6b84455-6de2-4e18-9ce9-064521f88942-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.623423 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.623306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-sys\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " 
pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724191 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-textfile\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724191 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fd12477-bb9e-4a4a-9298-5e067ebfd148-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.724191 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a6b84455-6de2-4e18-9ce9-064521f88942-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724191 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-sys\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/24678d87-1b27-4719-b309-5f2c3c4150d3-metrics-client-ca\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h84x\" (UniqueName: \"kubernetes.io/projected/a6b84455-6de2-4e18-9ce9-064521f88942-kube-api-access-6h84x\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-sys\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:07:30.724309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fd12477-bb9e-4a4a-9298-5e067ebfd148-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-root\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76428\" (UniqueName: \"kubernetes.io/projected/24678d87-1b27-4719-b309-5f2c3c4150d3-kube-api-access-76428\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/1fd12477-bb9e-4a4a-9298-5e067ebfd148-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-tls\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-wtmp\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6b84455-6de2-4e18-9ce9-064521f88942-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a6b84455-6de2-4e18-9ce9-064521f88942-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9r7\" (UniqueName: \"kubernetes.io/projected/1fd12477-bb9e-4a4a-9298-5e067ebfd148-kube-api-access-jp9r7\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.724955 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.724865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fd12477-bb9e-4a4a-9298-5e067ebfd148-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.725373 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.725029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-wtmp\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.725373 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.725171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24678d87-1b27-4719-b309-5f2c3c4150d3-root\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.725373 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.725298 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.725373 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.725324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24678d87-1b27-4719-b309-5f2c3c4150d3-metrics-client-ca\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.725579 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.725387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6b84455-6de2-4e18-9ce9-064521f88942-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.725579 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.725449 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-textfile\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.726571 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.726539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.727661 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.727634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fd12477-bb9e-4a4a-9298-5e067ebfd148-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.727762 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.727641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.727762 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.727693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.728002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.727980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fd12477-bb9e-4a4a-9298-5e067ebfd148-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.728087 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.728073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24678d87-1b27-4719-b309-5f2c3c4150d3-node-exporter-tls\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.728192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.728149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6b84455-6de2-4e18-9ce9-064521f88942-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.732210 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.732119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76428\" (UniqueName: \"kubernetes.io/projected/24678d87-1b27-4719-b309-5f2c3c4150d3-kube-api-access-76428\") pod \"node-exporter-9lz6x\" (UID: \"24678d87-1b27-4719-b309-5f2c3c4150d3\") " pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.732210 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:07:30.732143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h84x\" (UniqueName: \"kubernetes.io/projected/a6b84455-6de2-4e18-9ce9-064521f88942-kube-api-access-6h84x\") pod \"kube-state-metrics-7479c89684-pgz48\" (UID: \"a6b84455-6de2-4e18-9ce9-064521f88942\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.732986 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.732966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9r7\" (UniqueName: \"kubernetes.io/projected/1fd12477-bb9e-4a4a-9298-5e067ebfd148-kube-api-access-jp9r7\") pod \"openshift-state-metrics-5669946b84-z9rb5\" (UID: \"1fd12477-bb9e-4a4a-9298-5e067ebfd148\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.757851 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.757814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" Apr 16 10:07:30.789540 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.789505 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" Apr 16 10:07:30.800070 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.800039 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9lz6x" Apr 16 10:07:30.809406 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:30.809365 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24678d87_1b27_4719_b309_5f2c3c4150d3.slice/crio-1065cf03f83e69170234e4139d9f7accbbede1ce613ca38a61dda1a0f815b752 WatchSource:0}: Error finding container 1065cf03f83e69170234e4139d9f7accbbede1ce613ca38a61dda1a0f815b752: Status 404 returned error can't find the container with id 1065cf03f83e69170234e4139d9f7accbbede1ce613ca38a61dda1a0f815b752 Apr 16 10:07:30.899580 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.899549 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5"] Apr 16 10:07:30.902648 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:30.902620 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd12477_bb9e_4a4a_9298_5e067ebfd148.slice/crio-fcf2e846a3090bcc39f5bedd9917c025a2960ea7707fbf30f70a80b8046aa984 WatchSource:0}: Error finding container fcf2e846a3090bcc39f5bedd9917c025a2960ea7707fbf30f70a80b8046aa984: Status 404 returned error can't find the container with id fcf2e846a3090bcc39f5bedd9917c025a2960ea7707fbf30f70a80b8046aa984 Apr 16 10:07:30.917182 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:30.917131 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-pgz48"] Apr 16 10:07:30.920573 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:30.920546 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b84455_6de2_4e18_9ce9_064521f88942.slice/crio-2ae5a319f8ec50664eb836a40ec56cb76ac723facba97b6835d17516927ca481 WatchSource:0}: Error finding container 
2ae5a319f8ec50664eb836a40ec56cb76ac723facba97b6835d17516927ca481: Status 404 returned error can't find the container with id 2ae5a319f8ec50664eb836a40ec56cb76ac723facba97b6835d17516927ca481 Apr 16 10:07:31.120498 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.120460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" event={"ID":"1fd12477-bb9e-4a4a-9298-5e067ebfd148","Type":"ContainerStarted","Data":"7dd6a661e7ed0a815840dd3e7398967e9de5a70247d546a306e4dd9809b61d02"} Apr 16 10:07:31.120498 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.120502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" event={"ID":"1fd12477-bb9e-4a4a-9298-5e067ebfd148","Type":"ContainerStarted","Data":"dde2230de9d39340545ce97cfb21f13330086dcbd6f95d25595e923982ca6a6c"} Apr 16 10:07:31.120727 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.120517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" event={"ID":"1fd12477-bb9e-4a4a-9298-5e067ebfd148","Type":"ContainerStarted","Data":"fcf2e846a3090bcc39f5bedd9917c025a2960ea7707fbf30f70a80b8046aa984"} Apr 16 10:07:31.121516 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.121488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" event={"ID":"a6b84455-6de2-4e18-9ce9-064521f88942","Type":"ContainerStarted","Data":"2ae5a319f8ec50664eb836a40ec56cb76ac723facba97b6835d17516927ca481"} Apr 16 10:07:31.122379 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.122348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lz6x" event={"ID":"24678d87-1b27-4719-b309-5f2c3c4150d3","Type":"ContainerStarted","Data":"1065cf03f83e69170234e4139d9f7accbbede1ce613ca38a61dda1a0f815b752"} Apr 16 10:07:31.515815 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:07:31.515780 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:07:31.519353 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.519328 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.525981 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.525931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 10:07:31.526109 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.525931 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 10:07:31.526223 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526204 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 10:07:31.526291 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526210 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m5f6n\"" Apr 16 10:07:31.526345 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526313 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 10:07:31.526398 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526343 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 10:07:31.526398 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526362 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 10:07:31.526506 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526492 2575 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 10:07:31.526778 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526582 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 10:07:31.526778 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.526710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 10:07:31.536049 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.536028 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:07:31.632668 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632635 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.632668 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.632888 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.632888 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632777 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.632888 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-web-config\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.632888 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlbd\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-kube-api-access-pdlbd\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.632888 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-volume\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.633056 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.633056 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-out\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.633056 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.632999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.633056 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.633019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.633056 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.633037 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.633056 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.633053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.733666 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.733634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-volume\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.733842 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.733686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.733922 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.733898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-out\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.733977 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.733944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734032 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.733976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734032 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-web-config\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734424 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:07:31.734275 2575 secret.go:189] 
Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 10:07:31.734424 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.734310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlbd\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-kube-api-access-pdlbd\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.734424 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:07:31.734374 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls podName:d0a1bf25-90fd-407a-b6d7-c07125231ae6 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:32.23435325 +0000 UTC m=+127.070956403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6") : secret "alertmanager-main-tls" not found Apr 16 10:07:31.735676 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.735616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.735939 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.735885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.737430 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.737380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.737526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.737468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-web-config\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.737602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.737528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.738355 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.738310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.738355 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.738342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.738579 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.738559 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.738907 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.738877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-out\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.739043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.739023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-volume\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:31.744670 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:31.744648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlbd\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-kube-api-access-pdlbd\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:32.127023 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:32.126987 2575 generic.go:358] "Generic (PLEG): container finished" podID="24678d87-1b27-4719-b309-5f2c3c4150d3" 
containerID="fdb449764c36c87bb566a595f903f5eb3559e31e835b7757c34d1bff19c7e2f8" exitCode=0 Apr 16 10:07:32.127255 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:32.127040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lz6x" event={"ID":"24678d87-1b27-4719-b309-5f2c3c4150d3","Type":"ContainerDied","Data":"fdb449764c36c87bb566a595f903f5eb3559e31e835b7757c34d1bff19c7e2f8"} Apr 16 10:07:32.239608 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:32.239566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:32.241983 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:32.241947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:32.429251 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:32.429163 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:07:32.761231 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:32.756020 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:07:33.132018 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.131978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" event={"ID":"a6b84455-6de2-4e18-9ce9-064521f88942","Type":"ContainerStarted","Data":"1fe27e933d27c5ddd349289c8f9cf9fd28717418184f0b0497e8f6ee08e022a7"} Apr 16 10:07:33.132018 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.132023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" event={"ID":"a6b84455-6de2-4e18-9ce9-064521f88942","Type":"ContainerStarted","Data":"31f7ff68f0916ef1482a916a8d2984cfdde3c2b5301e6a5f51c309c827715543"} Apr 16 10:07:33.132310 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.132038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" event={"ID":"a6b84455-6de2-4e18-9ce9-064521f88942","Type":"ContainerStarted","Data":"9f6056ae90a9f20d5c59547984c23110872f5c1eeda8a4f271bcdc101881aa32"} Apr 16 10:07:33.133839 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.133809 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lz6x" event={"ID":"24678d87-1b27-4719-b309-5f2c3c4150d3","Type":"ContainerStarted","Data":"6c9fd98b3fe21b864d18ebd77769eca70107a98545efc5e6f5d44618f7d8e625"} Apr 16 10:07:33.133965 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.133843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lz6x" event={"ID":"24678d87-1b27-4719-b309-5f2c3c4150d3","Type":"ContainerStarted","Data":"a1d621c56336e5889f4b923fece2f8425f8f3f8bf9150d043fa1926b9f9980db"} Apr 16 
10:07:33.135695 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.135671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" event={"ID":"1fd12477-bb9e-4a4a-9298-5e067ebfd148","Type":"ContainerStarted","Data":"bfbbcdc63f298991068c6a3e4bfb243e594a4b8d5ddb1a923d72340f6315d713"} Apr 16 10:07:33.136697 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.136678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"e627abfffde50f963af8eef316da212a8138416bad07efbbc70740fde1ddb4f5"} Apr 16 10:07:33.154830 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.154786 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-pgz48" podStartSLOduration=1.486432154 podStartE2EDuration="3.154772901s" podCreationTimestamp="2026-04-16 10:07:30 +0000 UTC" firstStartedPulling="2026-04-16 10:07:30.922445737 +0000 UTC m=+125.759048889" lastFinishedPulling="2026-04-16 10:07:32.590786486 +0000 UTC m=+127.427389636" observedRunningTime="2026-04-16 10:07:33.153560525 +0000 UTC m=+127.990163710" watchObservedRunningTime="2026-04-16 10:07:33.154772901 +0000 UTC m=+127.991376071" Apr 16 10:07:33.178440 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.178392 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9rb5" podStartSLOduration=1.61182015 podStartE2EDuration="3.178378911s" podCreationTimestamp="2026-04-16 10:07:30 +0000 UTC" firstStartedPulling="2026-04-16 10:07:31.02104454 +0000 UTC m=+125.857647689" lastFinishedPulling="2026-04-16 10:07:32.587603282 +0000 UTC m=+127.424206450" observedRunningTime="2026-04-16 10:07:33.177380348 +0000 UTC m=+128.013983519" watchObservedRunningTime="2026-04-16 10:07:33.178378911 +0000 UTC 
m=+128.014982116" Apr 16 10:07:33.205265 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:33.205211 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9lz6x" podStartSLOduration=2.480497476 podStartE2EDuration="3.20519538s" podCreationTimestamp="2026-04-16 10:07:30 +0000 UTC" firstStartedPulling="2026-04-16 10:07:30.81187843 +0000 UTC m=+125.648481585" lastFinishedPulling="2026-04-16 10:07:31.536576323 +0000 UTC m=+126.373179489" observedRunningTime="2026-04-16 10:07:33.203134232 +0000 UTC m=+128.039737403" watchObservedRunningTime="2026-04-16 10:07:33.20519538 +0000 UTC m=+128.041798551" Apr 16 10:07:35.144021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.143981 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd" exitCode=0 Apr 16 10:07:35.144426 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.144067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd"} Apr 16 10:07:35.200396 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.200361 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7"] Apr 16 10:07:35.203597 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.203576 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:35.206331 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.206305 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 10:07:35.206453 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.206338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-m7w45\"" Apr 16 10:07:35.211960 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.211937 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7"] Apr 16 10:07:35.374624 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.374583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5d767b9e-377e-4435-b350-6d44a9d2b985-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-gd6c7\" (UID: \"5d767b9e-377e-4435-b350-6d44a9d2b985\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:35.374789 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.374646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:07:35.377028 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.377009 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1591f776-8015-497b-a4bf-80b359c62427-metrics-certs\") pod \"network-metrics-daemon-8r8s4\" (UID: \"1591f776-8015-497b-a4bf-80b359c62427\") " pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 
10:07:35.475208 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.475093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5d767b9e-377e-4435-b350-6d44a9d2b985-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-gd6c7\" (UID: \"5d767b9e-377e-4435-b350-6d44a9d2b985\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:35.475363 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:07:35.475264 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 10:07:35.475363 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:07:35.475334 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d767b9e-377e-4435-b350-6d44a9d2b985-monitoring-plugin-cert podName:5d767b9e-377e-4435-b350-6d44a9d2b985 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:35.975317159 +0000 UTC m=+130.811920312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/5d767b9e-377e-4435-b350-6d44a9d2b985-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-gd6c7" (UID: "5d767b9e-377e-4435-b350-6d44a9d2b985") : secret "monitoring-plugin-cert" not found Apr 16 10:07:35.650276 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.650239 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-btvgh\"" Apr 16 10:07:35.658199 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.658168 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8r8s4" Apr 16 10:07:35.674228 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.674187 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-75f64b965c-tk4lz"] Apr 16 10:07:35.715729 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.714063 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75f64b965c-tk4lz"] Apr 16 10:07:35.715729 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.714302 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.717795 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.717435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j6w4d\"" Apr 16 10:07:35.717795 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.717715 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 10:07:35.718081 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.718063 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 10:07:35.718300 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.718099 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 10:07:35.718399 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.718378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 10:07:35.718465 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.718430 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 10:07:35.723909 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.723396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 10:07:35.812451 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.812369 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8r8s4"] Apr 16 10:07:35.879951 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.879912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-metrics-client-ca\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.879962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-serving-certs-ca-bundle\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.880050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-telemeter-client-tls\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.880134 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-federate-client-tls\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.880183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.880210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxcr\" (UniqueName: \"kubernetes.io/projected/2faa2477-0a90-47a6-b2fd-ef41b76224f7-kube-api-access-rvxcr\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.880251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.880513 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.880302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" 
(UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-secret-telemeter-client\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981454 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxcr\" (UniqueName: \"kubernetes.io/projected/2faa2477-0a90-47a6-b2fd-ef41b76224f7-kube-api-access-rvxcr\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-secret-telemeter-client\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-metrics-client-ca\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: 
\"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-serving-certs-ca-bundle\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981643 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5d767b9e-377e-4435-b350-6d44a9d2b985-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-gd6c7\" (UID: \"5d767b9e-377e-4435-b350-6d44a9d2b985\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:35.981879 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-telemeter-client-tls\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981879 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-federate-client-tls\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.981961 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.981902 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.982386 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.982358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-metrics-client-ca\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.984402 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.984373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-secret-telemeter-client\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.984504 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.984430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-federate-client-tls\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.984504 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.984435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-telemeter-client-tls\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") 
" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.984616 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.984594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2faa2477-0a90-47a6-b2fd-ef41b76224f7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.984654 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.984645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5d767b9e-377e-4435-b350-6d44a9d2b985-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-gd6c7\" (UID: \"5d767b9e-377e-4435-b350-6d44a9d2b985\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:35.989535 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.989510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvxcr\" (UniqueName: \"kubernetes.io/projected/2faa2477-0a90-47a6-b2fd-ef41b76224f7-kube-api-access-rvxcr\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.993695 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.993669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-serving-certs-ca-bundle\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:35.994062 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:35.994039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faa2477-0a90-47a6-b2fd-ef41b76224f7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75f64b965c-tk4lz\" (UID: \"2faa2477-0a90-47a6-b2fd-ef41b76224f7\") " pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:36.026118 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:36.026041 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" Apr 16 10:07:36.078864 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:36.078839 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57d769fd65-pw5bg" Apr 16 10:07:36.113467 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:36.113432 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:36.151853 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:36.151815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8r8s4" event={"ID":"1591f776-8015-497b-a4bf-80b359c62427","Type":"ContainerStarted","Data":"eeb55fc5f3316b06615f14121b4e0ef85cde0e5b186b7d34518b31de7f672724"} Apr 16 10:07:36.165701 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:36.165629 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75f64b965c-tk4lz"] Apr 16 10:07:36.170644 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:36.170600 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faa2477_0a90_47a6_b2fd_ef41b76224f7.slice/crio-83349c71188815cbdf33b0514267de8d8a393a3521408c78467dc0a4f9f50ba5 WatchSource:0}: Error finding container 83349c71188815cbdf33b0514267de8d8a393a3521408c78467dc0a4f9f50ba5: Status 404 returned error can't find the container with id 
83349c71188815cbdf33b0514267de8d8a393a3521408c78467dc0a4f9f50ba5 Apr 16 10:07:36.289252 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:36.289215 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7"] Apr 16 10:07:36.291672 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:36.291637 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d767b9e_377e_4435_b350_6d44a9d2b985.slice/crio-c0b4cbbd9dd44c1cecad7d04f92bf37771d833a4c2f6121cc3fd22302ed0e0a3 WatchSource:0}: Error finding container c0b4cbbd9dd44c1cecad7d04f92bf37771d833a4c2f6121cc3fd22302ed0e0a3: Status 404 returned error can't find the container with id c0b4cbbd9dd44c1cecad7d04f92bf37771d833a4c2f6121cc3fd22302ed0e0a3 Apr 16 10:07:37.159091 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:37.159049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" event={"ID":"5d767b9e-377e-4435-b350-6d44a9d2b985","Type":"ContainerStarted","Data":"c0b4cbbd9dd44c1cecad7d04f92bf37771d833a4c2f6121cc3fd22302ed0e0a3"} Apr 16 10:07:37.160377 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:37.160347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" event={"ID":"2faa2477-0a90-47a6-b2fd-ef41b76224f7","Type":"ContainerStarted","Data":"83349c71188815cbdf33b0514267de8d8a393a3521408c78467dc0a4f9f50ba5"} Apr 16 10:07:38.167372 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.167332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e"} Apr 16 10:07:38.167372 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.167378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91"} Apr 16 10:07:38.167819 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.167390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1"} Apr 16 10:07:38.167819 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.167398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc"} Apr 16 10:07:38.167819 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.167406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306"} Apr 16 10:07:38.168890 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.168864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8r8s4" event={"ID":"1591f776-8015-497b-a4bf-80b359c62427","Type":"ContainerStarted","Data":"4d331d3d4a075af31f6b96ce434ba1ea9e7ac3cec57c0575471cbbc9e27a19a6"} Apr 16 10:07:38.169102 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.168897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8r8s4" event={"ID":"1591f776-8015-497b-a4bf-80b359c62427","Type":"ContainerStarted","Data":"b74f2d2552b9a3a02c18d1e5a2918eab0f1a6f4e64ae3bd1f88319c421de8842"} Apr 16 10:07:38.170348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.170313 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" event={"ID":"5d767b9e-377e-4435-b350-6d44a9d2b985","Type":"ContainerStarted","Data":"067842da15842c8283d6e7110ddaf82c37af7f49c4bd4e43d023e37e5b196d15"} Apr 16 10:07:38.170558 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.170527 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:38.176337 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.176313 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" Apr 16 10:07:38.187415 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.187369 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8r8s4" podStartSLOduration=131.751029642 podStartE2EDuration="2m13.187355399s" podCreationTimestamp="2026-04-16 10:05:25 +0000 UTC" firstStartedPulling="2026-04-16 10:07:35.819379201 +0000 UTC m=+130.655982360" lastFinishedPulling="2026-04-16 10:07:37.255704957 +0000 UTC m=+132.092308117" observedRunningTime="2026-04-16 10:07:38.186028093 +0000 UTC m=+133.022631268" watchObservedRunningTime="2026-04-16 10:07:38.187355399 +0000 UTC m=+133.023958571" Apr 16 10:07:38.201684 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:38.201586 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-gd6c7" podStartSLOduration=1.5070230009999999 podStartE2EDuration="3.201574238s" podCreationTimestamp="2026-04-16 10:07:35 +0000 UTC" firstStartedPulling="2026-04-16 10:07:36.294546284 +0000 UTC m=+131.131149438" lastFinishedPulling="2026-04-16 10:07:37.989097511 +0000 UTC m=+132.825700675" observedRunningTime="2026-04-16 10:07:38.1999092 +0000 UTC m=+133.036512373" watchObservedRunningTime="2026-04-16 10:07:38.201574238 +0000 UTC 
m=+133.038177408" Apr 16 10:07:39.175213 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:39.175185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" event={"ID":"2faa2477-0a90-47a6-b2fd-ef41b76224f7","Type":"ContainerStarted","Data":"d55727ec09b4e976bad3c1069fff9100b355be4e57a03da52fb4e315f3f2abfe"} Apr 16 10:07:39.175553 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:39.175221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" event={"ID":"2faa2477-0a90-47a6-b2fd-ef41b76224f7","Type":"ContainerStarted","Data":"51f4a7622504c43dc6fe362336b97107de0d33d03cebebcc0c3db97492da5567"} Apr 16 10:07:40.180956 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:40.180917 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerStarted","Data":"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a"} Apr 16 10:07:40.182853 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:40.182828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" event={"ID":"2faa2477-0a90-47a6-b2fd-ef41b76224f7","Type":"ContainerStarted","Data":"458ea8f1377c7fc457bdd3a88630f19cf17a5372c9c4d0e66fcce23f39ce3e86"} Apr 16 10:07:40.213683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:40.213534 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.76838876 podStartE2EDuration="9.213515849s" podCreationTimestamp="2026-04-16 10:07:31 +0000 UTC" firstStartedPulling="2026-04-16 10:07:32.773673154 +0000 UTC m=+127.610276309" lastFinishedPulling="2026-04-16 10:07:39.218800237 +0000 UTC m=+134.055403398" observedRunningTime="2026-04-16 10:07:40.211426952 +0000 UTC m=+135.048030127" watchObservedRunningTime="2026-04-16 
10:07:40.213515849 +0000 UTC m=+135.050119021" Apr 16 10:07:40.233774 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:40.233722 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-75f64b965c-tk4lz" podStartSLOduration=2.504220331 podStartE2EDuration="5.233702058s" podCreationTimestamp="2026-04-16 10:07:35 +0000 UTC" firstStartedPulling="2026-04-16 10:07:36.173478891 +0000 UTC m=+131.010082046" lastFinishedPulling="2026-04-16 10:07:38.902960621 +0000 UTC m=+133.739563773" observedRunningTime="2026-04-16 10:07:40.231571022 +0000 UTC m=+135.068174193" watchObservedRunningTime="2026-04-16 10:07:40.233702058 +0000 UTC m=+135.070305229" Apr 16 10:07:42.953477 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.953443 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7854f46d94-4xbhc"] Apr 16 10:07:42.958050 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.958028 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:42.960366 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.960342 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 10:07:42.960458 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.960404 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 10:07:42.961429 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.961408 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 10:07:42.961538 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.961411 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 10:07:42.961538 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.961441 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 10:07:42.961617 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.961608 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-psr7l\"" Apr 16 10:07:42.961793 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.961773 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 10:07:42.961793 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.961789 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 10:07:42.969666 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:42.969646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7854f46d94-4xbhc"] Apr 16 10:07:43.044259 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:07:43.044216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-serving-cert\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.044418 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.044326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-console-config\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.044418 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.044394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-oauth-serving-cert\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.044518 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.044427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-service-ca\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.044518 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.044443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbpf\" (UniqueName: \"kubernetes.io/projected/43b6e083-34ec-4128-8bb7-abce70cd9138-kube-api-access-glbpf\") pod \"console-7854f46d94-4xbhc\" (UID: 
\"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.044518 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.044465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-oauth-config\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.145280 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.145245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-console-config\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.145468 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.145303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-oauth-serving-cert\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.145468 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.145332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-service-ca\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc" Apr 16 10:07:43.145468 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.145349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glbpf\" (UniqueName: 
\"kubernetes.io/projected/43b6e083-34ec-4128-8bb7-abce70cd9138-kube-api-access-glbpf\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.145468 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.145369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-oauth-config\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.145468 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.145399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-serving-cert\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.146062 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.146039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-oauth-serving-cert\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.146176 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.146079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-console-config\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.146176 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.146079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-service-ca\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.147910 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.147883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-oauth-config\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.147910 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.147888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-serving-cert\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.152556 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.152539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbpf\" (UniqueName: \"kubernetes.io/projected/43b6e083-34ec-4128-8bb7-abce70cd9138-kube-api-access-glbpf\") pod \"console-7854f46d94-4xbhc\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") " pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.267366 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.267274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:43.392465 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:43.392438 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7854f46d94-4xbhc"]
Apr 16 10:07:43.394691 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:43.394668 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b6e083_34ec_4128_8bb7_abce70cd9138.slice/crio-bf403412e1ee245496db3f9f111147de3287887db4c1182f1794a054d05de19d WatchSource:0}: Error finding container bf403412e1ee245496db3f9f111147de3287887db4c1182f1794a054d05de19d: Status 404 returned error can't find the container with id bf403412e1ee245496db3f9f111147de3287887db4c1182f1794a054d05de19d
Apr 16 10:07:44.197930 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:44.197886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7854f46d94-4xbhc" event={"ID":"43b6e083-34ec-4128-8bb7-abce70cd9138","Type":"ContainerStarted","Data":"bf403412e1ee245496db3f9f111147de3287887db4c1182f1794a054d05de19d"}
Apr 16 10:07:47.207622 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:47.207584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7854f46d94-4xbhc" event={"ID":"43b6e083-34ec-4128-8bb7-abce70cd9138","Type":"ContainerStarted","Data":"ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d"}
Apr 16 10:07:47.229409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:47.229359 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7854f46d94-4xbhc" podStartSLOduration=2.015125675 podStartE2EDuration="5.229343263s" podCreationTimestamp="2026-04-16 10:07:42 +0000 UTC" firstStartedPulling="2026-04-16 10:07:43.396988125 +0000 UTC m=+138.233591275" lastFinishedPulling="2026-04-16 10:07:46.611205701 +0000 UTC m=+141.447808863" observedRunningTime="2026-04-16 10:07:47.227443102 +0000 UTC m=+142.064046294" watchObservedRunningTime="2026-04-16 10:07:47.229343263 +0000 UTC m=+142.065946433"
Apr 16 10:07:51.333192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.333138 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-578b6f7475-vjjmq"]
Apr 16 10:07:51.338566 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.338544 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.346787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.346764 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 10:07:51.347483 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.347460 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578b6f7475-vjjmq"]
Apr 16 10:07:51.418513 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-oauth-serving-cert\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.418702 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-trusted-ca-bundle\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.418702 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-oauth-config\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.418702 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-serving-cert\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.418702 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lgc\" (UniqueName: \"kubernetes.io/projected/7ee30a63-6480-4107-8a75-b94d7143af5b-kube-api-access-n2lgc\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.418835 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-console-config\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.418835 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.418770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-service-ca\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519473 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-service-ca\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-oauth-serving-cert\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-trusted-ca-bundle\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-oauth-config\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-serving-cert\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519782 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lgc\" (UniqueName: \"kubernetes.io/projected/7ee30a63-6480-4107-8a75-b94d7143af5b-kube-api-access-n2lgc\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.519918 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.519891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-console-config\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.520305 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.520264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-service-ca\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.520456 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.520430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-oauth-serving-cert\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.520584 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.520563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-trusted-ca-bundle\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.520648 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.520612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-console-config\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.522085 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.522062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-serving-cert\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.522173 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.522125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-oauth-config\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.529195 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.529149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lgc\" (UniqueName: \"kubernetes.io/projected/7ee30a63-6480-4107-8a75-b94d7143af5b-kube-api-access-n2lgc\") pod \"console-578b6f7475-vjjmq\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.649594 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.649509 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:07:51.774904 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:51.774877 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578b6f7475-vjjmq"]
Apr 16 10:07:51.777317 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:07:51.777281 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ee30a63_6480_4107_8a75_b94d7143af5b.slice/crio-b7dc375e8c4a4feced8173b80d90dfeac51010194f57403f9791c4d0d58d9115 WatchSource:0}: Error finding container b7dc375e8c4a4feced8173b80d90dfeac51010194f57403f9791c4d0d58d9115: Status 404 returned error can't find the container with id b7dc375e8c4a4feced8173b80d90dfeac51010194f57403f9791c4d0d58d9115
Apr 16 10:07:52.224321 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:52.224281 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578b6f7475-vjjmq" event={"ID":"7ee30a63-6480-4107-8a75-b94d7143af5b","Type":"ContainerStarted","Data":"c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2"}
Apr 16 10:07:52.224321 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:52.224326 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578b6f7475-vjjmq" event={"ID":"7ee30a63-6480-4107-8a75-b94d7143af5b","Type":"ContainerStarted","Data":"b7dc375e8c4a4feced8173b80d90dfeac51010194f57403f9791c4d0d58d9115"}
Apr 16 10:07:52.242873 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:52.242806 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-578b6f7475-vjjmq" podStartSLOduration=1.242791538 podStartE2EDuration="1.242791538s" podCreationTimestamp="2026-04-16 10:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:07:52.241527367 +0000 UTC m=+147.078130551" watchObservedRunningTime="2026-04-16 10:07:52.242791538 +0000 UTC m=+147.079394710"
Apr 16 10:07:53.268225 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:53.268185 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:53.268225 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:53.268226 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:53.272835 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:53.272812 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:07:54.234080 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:07:54.234050 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:08:01.650517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:01.650472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:08:01.650517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:01.650521 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:08:01.655409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:01.655381 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:08:02.257897 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:02.257869 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-578b6f7475-vjjmq"
Apr 16 10:08:02.311925 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:02.311897 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7854f46d94-4xbhc"]
Apr 16 10:08:06.267021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:06.266986 2575 generic.go:358] "Generic (PLEG): container finished" podID="d7b584f5-67ae-4ab7-9eb5-4bd14248b512" containerID="47c8b1f8a924b8f10f37de05774db99857e9bd2bd5d79b538df6da1a8f34adf4" exitCode=0
Apr 16 10:08:06.267398 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:06.267034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" event={"ID":"d7b584f5-67ae-4ab7-9eb5-4bd14248b512","Type":"ContainerDied","Data":"47c8b1f8a924b8f10f37de05774db99857e9bd2bd5d79b538df6da1a8f34adf4"}
Apr 16 10:08:06.267454 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:06.267437 2575 scope.go:117] "RemoveContainer" containerID="47c8b1f8a924b8f10f37de05774db99857e9bd2bd5d79b538df6da1a8f34adf4"
Apr 16 10:08:06.882431 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:06.882398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8677bf5f94-ttzcm_029997ce-820e-46fb-9d03-11be41d65ce4/router/0.log"
Apr 16 10:08:06.887498 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:06.887471 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2lw46_44072043-e38d-468a-ae02-9082c94f67cc/serve-healthcheck-canary/0.log"
Apr 16 10:08:07.271669 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:07.271560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fcg2p" event={"ID":"d7b584f5-67ae-4ab7-9eb5-4bd14248b512","Type":"ContainerStarted","Data":"6134de4b3a37114302c6a6d1249172880ad86f2e9bc1859b41b362f7ca98abd5"}
Apr 16 10:08:11.284112 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:11.284078 2575 generic.go:358] "Generic (PLEG): container finished" podID="24e1c01b-c627-492b-b514-d6583deef22d" containerID="a0be17286d622c5fe138d80abeb2f6433baf7e2df1b04011baa10b2fc29116c2" exitCode=0
Apr 16 10:08:11.284511 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:11.284176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" event={"ID":"24e1c01b-c627-492b-b514-d6583deef22d","Type":"ContainerDied","Data":"a0be17286d622c5fe138d80abeb2f6433baf7e2df1b04011baa10b2fc29116c2"}
Apr 16 10:08:11.284554 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:11.284545 2575 scope.go:117] "RemoveContainer" containerID="a0be17286d622c5fe138d80abeb2f6433baf7e2df1b04011baa10b2fc29116c2"
Apr 16 10:08:12.288923 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:12.288887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-dbqbq" event={"ID":"24e1c01b-c627-492b-b514-d6583deef22d","Type":"ContainerStarted","Data":"ef334600681a9c0f0471534e31f31805804a9ed45f139c88568b9b4b31f78f29"}
Apr 16 10:08:27.334189 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.334099 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7854f46d94-4xbhc" podUID="43b6e083-34ec-4128-8bb7-abce70cd9138" containerName="console" containerID="cri-o://ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d" gracePeriod=15
Apr 16 10:08:27.581048 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.581016 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7854f46d94-4xbhc_43b6e083-34ec-4128-8bb7-abce70cd9138/console/0.log"
Apr 16 10:08:27.581199 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.581089 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:08:27.740737 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.740656 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-console-config\") pod \"43b6e083-34ec-4128-8bb7-abce70cd9138\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") "
Apr 16 10:08:27.740737 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.740714 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-service-ca\") pod \"43b6e083-34ec-4128-8bb7-abce70cd9138\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") "
Apr 16 10:08:27.740936 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.740739 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-serving-cert\") pod \"43b6e083-34ec-4128-8bb7-abce70cd9138\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") "
Apr 16 10:08:27.740936 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.740760 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glbpf\" (UniqueName: \"kubernetes.io/projected/43b6e083-34ec-4128-8bb7-abce70cd9138-kube-api-access-glbpf\") pod \"43b6e083-34ec-4128-8bb7-abce70cd9138\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") "
Apr 16 10:08:27.740936 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.740836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-oauth-serving-cert\") pod \"43b6e083-34ec-4128-8bb7-abce70cd9138\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") "
Apr 16 10:08:27.740936 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.740880 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-oauth-config\") pod \"43b6e083-34ec-4128-8bb7-abce70cd9138\" (UID: \"43b6e083-34ec-4128-8bb7-abce70cd9138\") "
Apr 16 10:08:27.741128 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.741053 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-console-config" (OuterVolumeSpecName: "console-config") pod "43b6e083-34ec-4128-8bb7-abce70cd9138" (UID: "43b6e083-34ec-4128-8bb7-abce70cd9138"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:27.741235 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.741212 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-service-ca" (OuterVolumeSpecName: "service-ca") pod "43b6e083-34ec-4128-8bb7-abce70cd9138" (UID: "43b6e083-34ec-4128-8bb7-abce70cd9138"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:27.741300 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.741255 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43b6e083-34ec-4128-8bb7-abce70cd9138" (UID: "43b6e083-34ec-4128-8bb7-abce70cd9138"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:27.743126 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.743097 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43b6e083-34ec-4128-8bb7-abce70cd9138" (UID: "43b6e083-34ec-4128-8bb7-abce70cd9138"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:27.743311 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.743176 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43b6e083-34ec-4128-8bb7-abce70cd9138" (UID: "43b6e083-34ec-4128-8bb7-abce70cd9138"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:27.743311 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.743225 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b6e083-34ec-4128-8bb7-abce70cd9138-kube-api-access-glbpf" (OuterVolumeSpecName: "kube-api-access-glbpf") pod "43b6e083-34ec-4128-8bb7-abce70cd9138" (UID: "43b6e083-34ec-4128-8bb7-abce70cd9138"). InnerVolumeSpecName "kube-api-access-glbpf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 10:08:27.841902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.841863 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-oauth-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\""
Apr 16 10:08:27.841902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.841893 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-console-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\""
Apr 16 10:08:27.841902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.841903 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-service-ca\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\""
Apr 16 10:08:27.842235 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.841915 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43b6e083-34ec-4128-8bb7-abce70cd9138-console-serving-cert\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\""
Apr 16 10:08:27.842235 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.841929 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-glbpf\" (UniqueName: \"kubernetes.io/projected/43b6e083-34ec-4128-8bb7-abce70cd9138-kube-api-access-glbpf\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\""
Apr 16 10:08:27.842235 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:27.841942 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43b6e083-34ec-4128-8bb7-abce70cd9138-oauth-serving-cert\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\""
Apr 16 10:08:28.343133 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.343105 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7854f46d94-4xbhc_43b6e083-34ec-4128-8bb7-abce70cd9138/console/0.log"
Apr 16 10:08:28.343547 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.343150 2575 generic.go:358] "Generic (PLEG): container finished" podID="43b6e083-34ec-4128-8bb7-abce70cd9138" containerID="ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d" exitCode=2
Apr 16 10:08:28.343547 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.343207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7854f46d94-4xbhc" event={"ID":"43b6e083-34ec-4128-8bb7-abce70cd9138","Type":"ContainerDied","Data":"ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d"}
Apr 16 10:08:28.343547 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.343234 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7854f46d94-4xbhc"
Apr 16 10:08:28.343547 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.343250 2575 scope.go:117] "RemoveContainer" containerID="ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d"
Apr 16 10:08:28.343547 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.343238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7854f46d94-4xbhc" event={"ID":"43b6e083-34ec-4128-8bb7-abce70cd9138","Type":"ContainerDied","Data":"bf403412e1ee245496db3f9f111147de3287887db4c1182f1794a054d05de19d"}
Apr 16 10:08:28.353104 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.353083 2575 scope.go:117] "RemoveContainer" containerID="ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d"
Apr 16 10:08:28.353387 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:28.353361 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d\": container with ID starting with ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d not found: ID does not exist" containerID="ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d"
Apr 16 10:08:28.353459 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.353394 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d"} err="failed to get container status \"ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d\": rpc error: code = NotFound desc = could not find container \"ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d\": container with ID starting with ffe11a6f916216daeff0c59e4c8eb700186621b9b91af600bf5286d624b7c18d not found: ID does not exist"
Apr 16 10:08:28.364730 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.364700 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7854f46d94-4xbhc"]
Apr 16 10:08:28.368004 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:28.367983 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7854f46d94-4xbhc"]
Apr 16 10:08:29.732996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:29.732959 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b6e083-34ec-4128-8bb7-abce70cd9138" path="/var/lib/kubelet/pods/43b6e083-34ec-4128-8bb7-abce70cd9138/volumes"
Apr 16 10:08:50.850960 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.850923 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 10:08:50.851570 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.851530 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="alertmanager" containerID="cri-o://c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306" gracePeriod=120
Apr 16 10:08:50.851723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.851590 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-metric" containerID="cri-o://b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e" gracePeriod=120
Apr 16 10:08:50.851723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.851598 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-web" containerID="cri-o://7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1" gracePeriod=120
Apr 16 10:08:50.851723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.851640 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy" containerID="cri-o://6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91" gracePeriod=120
Apr 16 10:08:50.851723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.851651 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="config-reloader" containerID="cri-o://195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc" gracePeriod=120
Apr 16 10:08:50.851723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:50.851705 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="prom-label-proxy" containerID="cri-o://3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a" gracePeriod=120
Apr 16 10:08:51.416902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416869 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a" exitCode=0
Apr 16 10:08:51.416902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416895 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91" exitCode=0
Apr 16 10:08:51.416902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416902 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc" exitCode=0
Apr 16 10:08:51.416902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416907 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306" exitCode=0
Apr 16 10:08:51.417192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416948 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a"}
Apr 16 10:08:51.417192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91"}
Apr 16 10:08:51.417192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.416996 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc"}
Apr 16 10:08:51.417192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:51.417007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306"}
Apr 16 10:08:52.098409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.098385 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 10:08:52.141209 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141148 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-metrics-client-ca\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") "
Apr 16 10:08:52.141386 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141225 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") "
Apr 16 10:08:52.141386 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141259 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-out\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") "
Apr 16 10:08:52.141386 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141309 2575
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-main-db\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141386 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141343 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141386 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141376 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdlbd\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-kube-api-access-pdlbd\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141403 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141437 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-trusted-ca-bundle\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:08:52.141471 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-web-config\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141509 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-tls-assets\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141543 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-web\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141559 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141584 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-volume\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141628 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141623 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-cluster-tls-config\") pod \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\" (UID: \"d0a1bf25-90fd-407a-b6d7-c07125231ae6\") " Apr 16 10:08:52.141985 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.141875 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-metrics-client-ca\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.142251 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.142220 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:08:52.142841 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.142815 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). 
InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:08:52.145499 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.145449 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.147208 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.146392 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.147208 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.146735 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.147208 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.146752 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-out" (OuterVolumeSpecName: "config-out") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:08:52.148695 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.148595 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:08:52.149071 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.149041 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-kube-api-access-pdlbd" (OuterVolumeSpecName: "kube-api-access-pdlbd") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "kube-api-access-pdlbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:08:52.150937 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.150895 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.151490 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.151460 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-volume" (OuterVolumeSpecName: "config-volume") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.156172 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.156126 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.164945 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.164914 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-web-config" (OuterVolumeSpecName: "web-config") pod "d0a1bf25-90fd-407a-b6d7-c07125231ae6" (UID: "d0a1bf25-90fd-407a-b6d7-c07125231ae6"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:52.243187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243089 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-main-tls\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243124 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-out\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243135 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-main-db\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243145 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243177 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdlbd\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-kube-api-access-pdlbd\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243187 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243187 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243197 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a1bf25-90fd-407a-b6d7-c07125231ae6-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243206 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-web-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243215 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0a1bf25-90fd-407a-b6d7-c07125231ae6-tls-assets\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243223 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243232 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-config-volume\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.243523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.243240 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d0a1bf25-90fd-407a-b6d7-c07125231ae6-cluster-tls-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:08:52.422900 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.422865 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e" exitCode=0 Apr 16 10:08:52.422900 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.422893 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerID="7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1" exitCode=0 Apr 16 10:08:52.423107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.422939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e"} Apr 16 10:08:52.423107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.422972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1"} Apr 16 10:08:52.423107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.422989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d0a1bf25-90fd-407a-b6d7-c07125231ae6","Type":"ContainerDied","Data":"e627abfffde50f963af8eef316da212a8138416bad07efbbc70740fde1ddb4f5"} Apr 16 10:08:52.423107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.422992 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.423107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.423007 2575 scope.go:117] "RemoveContainer" containerID="3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a" Apr 16 10:08:52.430146 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.430130 2575 scope.go:117] "RemoveContainer" containerID="b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e" Apr 16 10:08:52.436996 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.436975 2575 scope.go:117] "RemoveContainer" containerID="6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91" Apr 16 10:08:52.443640 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.443619 2575 scope.go:117] "RemoveContainer" containerID="7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1" Apr 16 10:08:52.450241 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.450222 2575 scope.go:117] "RemoveContainer" containerID="195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc" Apr 16 10:08:52.456635 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.456615 2575 scope.go:117] "RemoveContainer" containerID="c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306" Apr 16 10:08:52.460981 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.460957 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:08:52.463903 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.463888 2575 scope.go:117] "RemoveContainer" containerID="80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd" Apr 16 10:08:52.470753 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.470734 2575 scope.go:117] "RemoveContainer" containerID="3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a" Apr 16 10:08:52.470998 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.470976 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a\": container with ID starting with 3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a not found: ID does not exist" containerID="3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a" Apr 16 10:08:52.471067 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471012 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a"} err="failed to get container status \"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a\": rpc error: code = NotFound desc = could not find container \"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a\": container with ID starting with 3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a not found: ID does not exist" Apr 16 10:08:52.471067 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471043 2575 scope.go:117] "RemoveContainer" containerID="b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e" Apr 16 10:08:52.471345 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.471328 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e\": container with ID starting with b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e not found: ID does not exist" containerID="b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e" Apr 16 10:08:52.471389 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471351 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e"} err="failed to get container status \"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e\": rpc error: code = 
NotFound desc = could not find container \"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e\": container with ID starting with b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e not found: ID does not exist" Apr 16 10:08:52.471389 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471367 2575 scope.go:117] "RemoveContainer" containerID="6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91" Apr 16 10:08:52.471592 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.471575 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91\": container with ID starting with 6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91 not found: ID does not exist" containerID="6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91" Apr 16 10:08:52.471631 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471597 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91"} err="failed to get container status \"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91\": rpc error: code = NotFound desc = could not find container \"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91\": container with ID starting with 6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91 not found: ID does not exist" Apr 16 10:08:52.471631 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471612 2575 scope.go:117] "RemoveContainer" containerID="7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1" Apr 16 10:08:52.471841 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.471825 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1\": container with ID starting with 7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1 not found: ID does not exist" containerID="7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1" Apr 16 10:08:52.471901 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471848 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1"} err="failed to get container status \"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1\": rpc error: code = NotFound desc = could not find container \"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1\": container with ID starting with 7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1 not found: ID does not exist" Apr 16 10:08:52.471901 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.471869 2575 scope.go:117] "RemoveContainer" containerID="195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc" Apr 16 10:08:52.472101 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.472084 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc\": container with ID starting with 195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc not found: ID does not exist" containerID="195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc" Apr 16 10:08:52.472138 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472106 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc"} err="failed to get container status \"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc\": rpc error: code = NotFound desc = could not find container 
\"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc\": container with ID starting with 195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc not found: ID does not exist" Apr 16 10:08:52.472138 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472121 2575 scope.go:117] "RemoveContainer" containerID="c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306" Apr 16 10:08:52.472403 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.472386 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306\": container with ID starting with c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306 not found: ID does not exist" containerID="c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306" Apr 16 10:08:52.472462 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472407 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306"} err="failed to get container status \"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306\": rpc error: code = NotFound desc = could not find container \"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306\": container with ID starting with c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306 not found: ID does not exist" Apr 16 10:08:52.472462 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472421 2575 scope.go:117] "RemoveContainer" containerID="80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd" Apr 16 10:08:52.472644 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:08:52.472627 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd\": container with ID starting with 
80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd not found: ID does not exist" containerID="80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd" Apr 16 10:08:52.472683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472650 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd"} err="failed to get container status \"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd\": rpc error: code = NotFound desc = could not find container \"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd\": container with ID starting with 80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd not found: ID does not exist" Apr 16 10:08:52.472683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472665 2575 scope.go:117] "RemoveContainer" containerID="3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a" Apr 16 10:08:52.472883 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472862 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a"} err="failed to get container status \"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a\": rpc error: code = NotFound desc = could not find container \"3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a\": container with ID starting with 3743e00e8105f485f1183f638d3a55e5d55c7c854f363d118d21f392143f561a not found: ID does not exist" Apr 16 10:08:52.472922 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.472884 2575 scope.go:117] "RemoveContainer" containerID="b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e" Apr 16 10:08:52.473084 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473067 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e"} err="failed to get container status \"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e\": rpc error: code = NotFound desc = could not find container \"b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e\": container with ID starting with b62794321027c1abe8045f6dfa411da1926491f98c69c9b4762d5a016ef1484e not found: ID does not exist" Apr 16 10:08:52.473084 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473083 2575 scope.go:117] "RemoveContainer" containerID="6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91" Apr 16 10:08:52.473311 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473292 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91"} err="failed to get container status \"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91\": rpc error: code = NotFound desc = could not find container \"6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91\": container with ID starting with 6156145cf18f5073574ae2ac58f9ca56b0c59c9e9ab508e1ad523d006a984f91 not found: ID does not exist" Apr 16 10:08:52.473311 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473309 2575 scope.go:117] "RemoveContainer" containerID="7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1" Apr 16 10:08:52.473496 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473482 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1"} err="failed to get container status \"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1\": rpc error: code = NotFound desc = could not find container \"7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1\": container with ID starting with 
7eef41e2a29064073cd89a0efa6381bc497a3aa519016828d8c6d9bc05398cc1 not found: ID does not exist" Apr 16 10:08:52.473540 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473497 2575 scope.go:117] "RemoveContainer" containerID="195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc" Apr 16 10:08:52.473690 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473669 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc"} err="failed to get container status \"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc\": rpc error: code = NotFound desc = could not find container \"195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc\": container with ID starting with 195543f61ef2858636372f4da0fa8e07d843de3d4dfd696eff89a13814c7a6cc not found: ID does not exist" Apr 16 10:08:52.473766 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473692 2575 scope.go:117] "RemoveContainer" containerID="c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306" Apr 16 10:08:52.473893 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473879 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306"} err="failed to get container status \"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306\": rpc error: code = NotFound desc = could not find container \"c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306\": container with ID starting with c7e3397e4ee05f3a31f4166eb3ef8b83326ae172e22b52da7fe9887f08f1c306 not found: ID does not exist" Apr 16 10:08:52.473947 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.473892 2575 scope.go:117] "RemoveContainer" containerID="80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd" Apr 16 10:08:52.474061 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.474045 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd"} err="failed to get container status \"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd\": rpc error: code = NotFound desc = could not find container \"80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd\": container with ID starting with 80d0a6e9203e7d636c304289c3e55218bdd6bcc1a84dc6d49bdd8725bf64b1fd not found: ID does not exist" Apr 16 10:08:52.486610 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.486583 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:08:52.535204 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535107 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:08:52.535479 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535460 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy" Apr 16 10:08:52.535479 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535481 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535496 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-metric" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535502 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-metric" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535516 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="43b6e083-34ec-4128-8bb7-abce70cd9138" containerName="console" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535521 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b6e083-34ec-4128-8bb7-abce70cd9138" containerName="console" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535529 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="config-reloader" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535534 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="config-reloader" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535543 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="prom-label-proxy" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535550 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="prom-label-proxy" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535556 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="init-config-reloader" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535562 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="init-config-reloader" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535570 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="alertmanager" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535575 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="alertmanager" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535580 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-web" Apr 16 10:08:52.535602 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535587 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-web" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535639 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="config-reloader" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535647 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="prom-label-proxy" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535652 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-metric" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535660 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy-web" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535667 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="alertmanager" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535674 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" containerName="kube-rbac-proxy" Apr 16 10:08:52.535990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.535680 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="43b6e083-34ec-4128-8bb7-abce70cd9138" containerName="console" Apr 16 10:08:52.538407 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.538391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.541336 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 10:08:52.541469 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541453 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m5f6n\"" Apr 16 10:08:52.541632 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 10:08:52.541632 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 10:08:52.541632 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541624 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 10:08:52.541812 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541603 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 10:08:52.542004 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.541981 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 10:08:52.542217 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.542200 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 10:08:52.542265 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.542243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 10:08:52.548560 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.548536 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 10:08:52.558295 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.558270 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:08:52.645294 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6lwp\" (UniqueName: \"kubernetes.io/projected/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-kube-api-access-m6lwp\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-config-volume\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-config-out\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645489 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645724 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645724 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645724 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645724 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645724 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.645681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-web-config\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.645873 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:08:52.645729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746580 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-config-volume\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746580 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746789 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-config-out\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746874 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746923 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.746974 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747022 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.746984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747077 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-web-config\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747077 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6lwp\" (UniqueName: \"kubernetes.io/projected/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-kube-api-access-m6lwp\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747804 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747780 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.747929 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.747810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.749530 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.749502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-config-out\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.749707 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.749685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.749846 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.749823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-config-volume\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.749908 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.749876 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.749908 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.749881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.750241 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.750220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.750347 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.750322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.750583 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.750563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-web-config\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.751046 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.751027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.751599 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.751576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.756306 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.756284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6lwp\" (UniqueName: \"kubernetes.io/projected/fa2b73b5-5c87-4838-92f1-ee572b2bdc8f-kube-api-access-m6lwp\") pod \"alertmanager-main-0\" (UID: \"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:52.848290 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:52.848253 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 10:08:53.001574 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:08:53.001538 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2b73b5_5c87_4838_92f1_ee572b2bdc8f.slice/crio-61351a5e58790a12f3dae070ac05eed5a08d34da4a6e1bfc8cc912bc594fcd59 WatchSource:0}: Error finding container 61351a5e58790a12f3dae070ac05eed5a08d34da4a6e1bfc8cc912bc594fcd59: Status 404 returned error can't find the container with id 61351a5e58790a12f3dae070ac05eed5a08d34da4a6e1bfc8cc912bc594fcd59 Apr 16 10:08:53.002342 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:53.002317 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 10:08:53.427405 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:53.427365 2575 generic.go:358] "Generic (PLEG): container finished" podID="fa2b73b5-5c87-4838-92f1-ee572b2bdc8f" containerID="bf22d90ffa7143c9b096e0d9b538ecd34c4fec334870daa600dadca50105dc13" exitCode=0 Apr 16 10:08:53.427829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:53.427411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerDied","Data":"bf22d90ffa7143c9b096e0d9b538ecd34c4fec334870daa600dadca50105dc13"} Apr 16 10:08:53.427829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:53.427451 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"61351a5e58790a12f3dae070ac05eed5a08d34da4a6e1bfc8cc912bc594fcd59"} Apr 16 10:08:53.733417 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:53.733384 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a1bf25-90fd-407a-b6d7-c07125231ae6" 
path="/var/lib/kubelet/pods/d0a1bf25-90fd-407a-b6d7-c07125231ae6/volumes" Apr 16 10:08:54.434359 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.434314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"7f720892e720bc09ad161b24c571909a08f96d0376309094ba2685838e8234e4"} Apr 16 10:08:54.434359 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.434363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"b9902db3e8aefca6199f4e2458cb1f8a589443bf478f7eb319f5b1b1ac85f5a5"} Apr 16 10:08:54.434359 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.434374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"658032cb08d7e85d6995df27558a1a5b35d6f99d46dcc2a8bc3b0f13c1b7b56b"} Apr 16 10:08:54.434798 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.434382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"d364b94261dde92f325e8cb2a3d2ebe5f10d86c41e9bacfe8f7586763fc04b38"} Apr 16 10:08:54.434798 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.434393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"5b531e990e426b821f8cb2598abd7b546fca703975dcce05dbb2636b120711a1"} Apr 16 10:08:54.434798 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.434402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"fa2b73b5-5c87-4838-92f1-ee572b2bdc8f","Type":"ContainerStarted","Data":"6667ed64f40a228467259df29b1c4f5a1391b33fb498cfb3f3e1bf8ce2551617"} Apr 16 10:08:54.470228 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:08:54.470176 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.470143415 podStartE2EDuration="2.470143415s" podCreationTimestamp="2026-04-16 10:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:08:54.468792659 +0000 UTC m=+209.305395841" watchObservedRunningTime="2026-04-16 10:08:54.470143415 +0000 UTC m=+209.306746586" Apr 16 10:09:02.755977 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.755884 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69fbc857dd-27zbk"] Apr 16 10:09:02.758839 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.758820 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.777171 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.777122 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69fbc857dd-27zbk"] Apr 16 10:09:02.833484 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwd4\" (UniqueName: \"kubernetes.io/projected/046451eb-ee97-4561-9a0e-e1c3bf6be268-kube-api-access-jcwd4\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.833683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-serving-cert\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.833683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-config\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.833683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-service-ca\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 
10:09:02.833837 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-trusted-ca-bundle\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.833837 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-oauth-serving-cert\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.833837 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.833744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-oauth-config\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.934778 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.934738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-service-ca\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.934968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.934801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-trusted-ca-bundle\") pod 
\"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.934968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.934822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-oauth-serving-cert\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.934968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.934842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-oauth-config\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.934968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.934868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcwd4\" (UniqueName: \"kubernetes.io/projected/046451eb-ee97-4561-9a0e-e1c3bf6be268-kube-api-access-jcwd4\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.934968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.934891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-serving-cert\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.935263 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.935063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-config\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.935638 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.935608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-service-ca\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.935767 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.935661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-oauth-serving-cert\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.935849 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.935830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-config\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.936061 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.936039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-trusted-ca-bundle\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.937521 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.937483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-oauth-config\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.937521 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.937513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-serving-cert\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:02.951040 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:02.951016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcwd4\" (UniqueName: \"kubernetes.io/projected/046451eb-ee97-4561-9a0e-e1c3bf6be268-kube-api-access-jcwd4\") pod \"console-69fbc857dd-27zbk\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:03.068203 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:03.068130 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:03.198653 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:03.198618 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69fbc857dd-27zbk"] Apr 16 10:09:03.202037 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:09:03.202006 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046451eb_ee97_4561_9a0e_e1c3bf6be268.slice/crio-8c3cc96ef0b493845a6de02ebf9c2dbb715cd8e05bd6c2228a551dd1b3dcbcaa WatchSource:0}: Error finding container 8c3cc96ef0b493845a6de02ebf9c2dbb715cd8e05bd6c2228a551dd1b3dcbcaa: Status 404 returned error can't find the container with id 8c3cc96ef0b493845a6de02ebf9c2dbb715cd8e05bd6c2228a551dd1b3dcbcaa Apr 16 10:09:03.467222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:03.467131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69fbc857dd-27zbk" event={"ID":"046451eb-ee97-4561-9a0e-e1c3bf6be268","Type":"ContainerStarted","Data":"0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f"} Apr 16 10:09:03.467222 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:03.467186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69fbc857dd-27zbk" event={"ID":"046451eb-ee97-4561-9a0e-e1c3bf6be268","Type":"ContainerStarted","Data":"8c3cc96ef0b493845a6de02ebf9c2dbb715cd8e05bd6c2228a551dd1b3dcbcaa"} Apr 16 10:09:03.492941 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:03.492895 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69fbc857dd-27zbk" podStartSLOduration=1.492879266 podStartE2EDuration="1.492879266s" podCreationTimestamp="2026-04-16 10:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:09:03.49142846 +0000 UTC 
m=+218.328031632" watchObservedRunningTime="2026-04-16 10:09:03.492879266 +0000 UTC m=+218.329482436" Apr 16 10:09:13.068692 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:13.068644 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:13.069106 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:13.068953 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:13.073597 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:13.073575 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:13.502284 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:13.502203 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:09:13.548496 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:13.548464 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578b6f7475-vjjmq"] Apr 16 10:09:24.817895 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.817864 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cgvfp"] Apr 16 10:09:24.820302 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.820285 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:24.822642 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.822621 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 10:09:24.828737 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.828712 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cgvfp"] Apr 16 10:09:24.916365 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.916310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/db3c6315-5043-4081-9649-b6ca08d1fb33-kubelet-config\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:24.916558 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.916412 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/db3c6315-5043-4081-9649-b6ca08d1fb33-original-pull-secret\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:24.916558 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:24.916464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/db3c6315-5043-4081-9649-b6ca08d1fb33-dbus\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.016915 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.016881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/db3c6315-5043-4081-9649-b6ca08d1fb33-kubelet-config\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.017108 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.016944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/db3c6315-5043-4081-9649-b6ca08d1fb33-original-pull-secret\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.017108 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.016984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/db3c6315-5043-4081-9649-b6ca08d1fb33-dbus\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.017108 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.017010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/db3c6315-5043-4081-9649-b6ca08d1fb33-kubelet-config\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.017302 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.017213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/db3c6315-5043-4081-9649-b6ca08d1fb33-dbus\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.019377 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.019355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/db3c6315-5043-4081-9649-b6ca08d1fb33-original-pull-secret\") pod \"global-pull-secret-syncer-cgvfp\" (UID: \"db3c6315-5043-4081-9649-b6ca08d1fb33\") " pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.131026 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.130934 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cgvfp" Apr 16 10:09:25.250282 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.250256 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cgvfp"] Apr 16 10:09:25.253317 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:09:25.253288 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3c6315_5043_4081_9649_b6ca08d1fb33.slice/crio-741ea7324c5b595774111fc75152efffa37f45189494c259bd120e49e9c185b7 WatchSource:0}: Error finding container 741ea7324c5b595774111fc75152efffa37f45189494c259bd120e49e9c185b7: Status 404 returned error can't find the container with id 741ea7324c5b595774111fc75152efffa37f45189494c259bd120e49e9c185b7 Apr 16 10:09:25.537571 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:25.537479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cgvfp" event={"ID":"db3c6315-5043-4081-9649-b6ca08d1fb33","Type":"ContainerStarted","Data":"741ea7324c5b595774111fc75152efffa37f45189494c259bd120e49e9c185b7"} Apr 16 10:09:29.552172 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:29.552125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cgvfp" event={"ID":"db3c6315-5043-4081-9649-b6ca08d1fb33","Type":"ContainerStarted","Data":"876275cdcda97b272d8dbe639c9ed4635122cfcd2e7d9b40c8111c54bde50c5e"} Apr 16 10:09:29.573818 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:29.573766 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cgvfp" podStartSLOduration=1.402253407 podStartE2EDuration="5.57374925s" podCreationTimestamp="2026-04-16 10:09:24 +0000 UTC" firstStartedPulling="2026-04-16 10:09:25.255347058 +0000 UTC m=+240.091950211" lastFinishedPulling="2026-04-16 10:09:29.426842901 +0000 UTC m=+244.263446054" observedRunningTime="2026-04-16 10:09:29.572853272 +0000 UTC m=+244.409456444" watchObservedRunningTime="2026-04-16 10:09:29.57374925 +0000 UTC m=+244.410352422" Apr 16 10:09:38.572190 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.572065 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-578b6f7475-vjjmq" podUID="7ee30a63-6480-4107-8a75-b94d7143af5b" containerName="console" containerID="cri-o://c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2" gracePeriod=15 Apr 16 10:09:38.820819 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.820791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578b6f7475-vjjmq_7ee30a63-6480-4107-8a75-b94d7143af5b/console/0.log" Apr 16 10:09:38.820966 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.820867 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578b6f7475-vjjmq" Apr 16 10:09:38.935726 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935635 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-serving-cert\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.935726 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935692 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-trusted-ca-bundle\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.935962 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935732 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2lgc\" (UniqueName: \"kubernetes.io/projected/7ee30a63-6480-4107-8a75-b94d7143af5b-kube-api-access-n2lgc\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.935962 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935753 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-console-config\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.935962 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935786 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-oauth-config\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.935962 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935832 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-service-ca\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.935962 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.935888 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-oauth-serving-cert\") pod \"7ee30a63-6480-4107-8a75-b94d7143af5b\" (UID: \"7ee30a63-6480-4107-8a75-b94d7143af5b\") " Apr 16 10:09:38.936296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.936266 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-console-config" (OuterVolumeSpecName: "console-config") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:09:38.936373 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.936306 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:09:38.936467 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.936437 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-service-ca" (OuterVolumeSpecName: "service-ca") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:09:38.936554 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.936526 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:09:38.938123 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.938102 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee30a63-6480-4107-8a75-b94d7143af5b-kube-api-access-n2lgc" (OuterVolumeSpecName: "kube-api-access-n2lgc") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "kube-api-access-n2lgc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:09:38.938428 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.938396 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:09:38.938528 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:38.938505 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7ee30a63-6480-4107-8a75-b94d7143af5b" (UID: "7ee30a63-6480-4107-8a75-b94d7143af5b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:09:39.036687 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036636 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-serving-cert\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.036687 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036689 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-trusted-ca-bundle\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.036885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036700 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2lgc\" (UniqueName: \"kubernetes.io/projected/7ee30a63-6480-4107-8a75-b94d7143af5b-kube-api-access-n2lgc\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.036885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036714 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-console-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.036885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036722 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7ee30a63-6480-4107-8a75-b94d7143af5b-console-oauth-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.036885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036731 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-service-ca\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.036885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.036741 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ee30a63-6480-4107-8a75-b94d7143af5b-oauth-serving-cert\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:09:39.582087 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.582058 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578b6f7475-vjjmq_7ee30a63-6480-4107-8a75-b94d7143af5b/console/0.log" Apr 16 10:09:39.582476 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.582102 2575 generic.go:358] "Generic (PLEG): container finished" podID="7ee30a63-6480-4107-8a75-b94d7143af5b" containerID="c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2" exitCode=2 Apr 16 10:09:39.582476 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.582191 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578b6f7475-vjjmq" Apr 16 10:09:39.582476 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.582189 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578b6f7475-vjjmq" event={"ID":"7ee30a63-6480-4107-8a75-b94d7143af5b","Type":"ContainerDied","Data":"c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2"} Apr 16 10:09:39.582476 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.582295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578b6f7475-vjjmq" event={"ID":"7ee30a63-6480-4107-8a75-b94d7143af5b","Type":"ContainerDied","Data":"b7dc375e8c4a4feced8173b80d90dfeac51010194f57403f9791c4d0d58d9115"} Apr 16 10:09:39.582476 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.582315 2575 scope.go:117] "RemoveContainer" containerID="c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2" Apr 16 10:09:39.590629 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.590613 2575 scope.go:117] "RemoveContainer" containerID="c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2" Apr 16 10:09:39.590908 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:09:39.590887 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2\": container with ID starting with c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2 not found: ID does not exist" containerID="c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2" Apr 16 10:09:39.590957 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.590918 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2"} err="failed to get container status \"c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2\": rpc error: code = 
NotFound desc = could not find container \"c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2\": container with ID starting with c832b2caf40ce54e49499254a9601fa1a7dfc1db696b47b7d424c75e0c3201b2 not found: ID does not exist" Apr 16 10:09:39.602882 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.602837 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578b6f7475-vjjmq"] Apr 16 10:09:39.604311 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.604290 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-578b6f7475-vjjmq"] Apr 16 10:09:39.734939 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:39.734906 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee30a63-6480-4107-8a75-b94d7143af5b" path="/var/lib/kubelet/pods/7ee30a63-6480-4107-8a75-b94d7143af5b/volumes" Apr 16 10:09:54.377329 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.377295 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg"] Apr 16 10:09:54.377797 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.377769 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ee30a63-6480-4107-8a75-b94d7143af5b" containerName="console" Apr 16 10:09:54.377797 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.377787 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee30a63-6480-4107-8a75-b94d7143af5b" containerName="console" Apr 16 10:09:54.377913 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.377877 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ee30a63-6480-4107-8a75-b94d7143af5b" containerName="console" Apr 16 10:09:54.380652 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.380631 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.383287 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.383259 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 10:09:54.384073 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.384049 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 10:09:54.384196 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.384052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zq6mv\"" Apr 16 10:09:54.390118 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.390094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg"] Apr 16 10:09:54.463745 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.463705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.463920 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.463771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.463920 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.463893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt5p\" (UniqueName: \"kubernetes.io/projected/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-kube-api-access-kvt5p\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.564690 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.564651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvt5p\" (UniqueName: \"kubernetes.io/projected/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-kube-api-access-kvt5p\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.564881 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.564728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.564881 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.564779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.565231 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:09:54.565213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.565286 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.565209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.573224 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.573193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvt5p\" (UniqueName: \"kubernetes.io/projected/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-kube-api-access-kvt5p\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.691364 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.691264 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:09:54.812121 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:54.812095 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg"] Apr 16 10:09:55.632753 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:09:55.632694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" event={"ID":"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a","Type":"ContainerStarted","Data":"ee2a01e9253667db780669872fb92df13e7ad0b5331078cab56703b82152ea49"} Apr 16 10:10:00.653301 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:00.653262 2575 generic.go:358] "Generic (PLEG): container finished" podID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerID="43673af0eccbc5f645573b8894ace0429eed60069bd2fa194e1cb38fb1e58543" exitCode=0 Apr 16 10:10:00.653754 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:00.653347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" event={"ID":"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a","Type":"ContainerDied","Data":"43673af0eccbc5f645573b8894ace0429eed60069bd2fa194e1cb38fb1e58543"} Apr 16 10:10:02.661740 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:02.661656 2575 generic.go:358] "Generic (PLEG): container finished" podID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerID="a8087414de649b8b4d7a22a38309f624d685375d6ccc69dfc2716a46f4a454f8" exitCode=0 Apr 16 10:10:02.661740 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:02.661722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" 
event={"ID":"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a","Type":"ContainerDied","Data":"a8087414de649b8b4d7a22a38309f624d685375d6ccc69dfc2716a46f4a454f8"} Apr 16 10:10:08.688553 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:08.688515 2575 generic.go:358] "Generic (PLEG): container finished" podID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerID="283d84230342e5f9ae2d24bf600d995cdb13fc95b6ebf37f1ed133080b83ed95" exitCode=0 Apr 16 10:10:08.688963 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:08.688562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" event={"ID":"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a","Type":"ContainerDied","Data":"283d84230342e5f9ae2d24bf600d995cdb13fc95b6ebf37f1ed133080b83ed95"} Apr 16 10:10:09.819511 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:09.819485 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:10:10.003085 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.002987 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-util\") pod \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " Apr 16 10:10:10.003085 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.003078 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvt5p\" (UniqueName: \"kubernetes.io/projected/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-kube-api-access-kvt5p\") pod \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " Apr 16 10:10:10.003306 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.003177 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-bundle\") pod \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\" (UID: \"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a\") " Apr 16 10:10:10.003811 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.003774 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-bundle" (OuterVolumeSpecName: "bundle") pod "554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" (UID: "554cfdf6-b1d0-4a5b-82b3-b5de3f61365a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:10:10.005454 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.005416 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-kube-api-access-kvt5p" (OuterVolumeSpecName: "kube-api-access-kvt5p") pod "554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" (UID: "554cfdf6-b1d0-4a5b-82b3-b5de3f61365a"). InnerVolumeSpecName "kube-api-access-kvt5p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:10:10.007212 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.007188 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-util" (OuterVolumeSpecName: "util") pod "554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" (UID: "554cfdf6-b1d0-4a5b-82b3-b5de3f61365a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:10:10.104882 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.104837 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvt5p\" (UniqueName: \"kubernetes.io/projected/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-kube-api-access-kvt5p\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:10:10.104882 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.104874 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-bundle\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:10:10.104882 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.104885 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/554cfdf6-b1d0-4a5b-82b3-b5de3f61365a-util\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:10:10.696918 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.696878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" event={"ID":"554cfdf6-b1d0-4a5b-82b3-b5de3f61365a","Type":"ContainerDied","Data":"ee2a01e9253667db780669872fb92df13e7ad0b5331078cab56703b82152ea49"} Apr 16 10:10:10.696918 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.696917 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2a01e9253667db780669872fb92df13e7ad0b5331078cab56703b82152ea49" Apr 16 10:10:10.697129 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:10.696962 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52m7xg" Apr 16 10:10:16.916134 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916103 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl"] Apr 16 10:10:16.916526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916432 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="pull" Apr 16 10:10:16.916526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916444 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="pull" Apr 16 10:10:16.916526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916460 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="extract" Apr 16 10:10:16.916526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916465 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="extract" Apr 16 10:10:16.916526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916478 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="util" Apr 16 10:10:16.916526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916484 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="util" Apr 16 10:10:16.916708 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.916532 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="554cfdf6-b1d0-4a5b-82b3-b5de3f61365a" containerName="extract" Apr 16 10:10:16.920534 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.920518 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:16.922822 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.922801 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 10:10:16.922927 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.922820 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-v7pwg\"" Apr 16 10:10:16.922927 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.922908 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:10:16.930431 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.930407 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl"] Apr 16 10:10:16.957530 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.957502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xt5\" (UniqueName: \"kubernetes.io/projected/ed7b624d-cfd8-447c-b769-41657d815e59-kube-api-access-s4xt5\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-hzczl\" (UID: \"ed7b624d-cfd8-447c-b769-41657d815e59\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:16.957683 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:16.957562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed7b624d-cfd8-447c-b769-41657d815e59-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-hzczl\" (UID: \"ed7b624d-cfd8-447c-b769-41657d815e59\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 
10:10:17.058541 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.058506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xt5\" (UniqueName: \"kubernetes.io/projected/ed7b624d-cfd8-447c-b769-41657d815e59-kube-api-access-s4xt5\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-hzczl\" (UID: \"ed7b624d-cfd8-447c-b769-41657d815e59\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:17.058731 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.058564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed7b624d-cfd8-447c-b769-41657d815e59-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-hzczl\" (UID: \"ed7b624d-cfd8-447c-b769-41657d815e59\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:17.058900 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.058882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed7b624d-cfd8-447c-b769-41657d815e59-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-hzczl\" (UID: \"ed7b624d-cfd8-447c-b769-41657d815e59\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:17.067491 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.067469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4xt5\" (UniqueName: \"kubernetes.io/projected/ed7b624d-cfd8-447c-b769-41657d815e59-kube-api-access-s4xt5\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-hzczl\" (UID: \"ed7b624d-cfd8-447c-b769-41657d815e59\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:17.230605 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.230509 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" Apr 16 10:10:17.352620 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.352538 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl"] Apr 16 10:10:17.355726 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:10:17.355698 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7b624d_cfd8_447c_b769_41657d815e59.slice/crio-0a53f07e458d426128ddbfd5c7d53b7ec57a06cae62f8a8d75e9c231cc6024e6 WatchSource:0}: Error finding container 0a53f07e458d426128ddbfd5c7d53b7ec57a06cae62f8a8d75e9c231cc6024e6: Status 404 returned error can't find the container with id 0a53f07e458d426128ddbfd5c7d53b7ec57a06cae62f8a8d75e9c231cc6024e6 Apr 16 10:10:17.718623 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:17.718582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" event={"ID":"ed7b624d-cfd8-447c-b769-41657d815e59","Type":"ContainerStarted","Data":"0a53f07e458d426128ddbfd5c7d53b7ec57a06cae62f8a8d75e9c231cc6024e6"} Apr 16 10:10:19.727570 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:19.727528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" event={"ID":"ed7b624d-cfd8-447c-b769-41657d815e59","Type":"ContainerStarted","Data":"e24ab46b2fd5f6d18f3fa64084dad6d3e6f410fed21aa1061174c89ee990c195"} Apr 16 10:10:19.747206 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:19.747134 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-hzczl" podStartSLOduration=2.244717737 podStartE2EDuration="3.747120831s" podCreationTimestamp="2026-04-16 10:10:16 +0000 UTC" 
firstStartedPulling="2026-04-16 10:10:17.358173641 +0000 UTC m=+292.194776791" lastFinishedPulling="2026-04-16 10:10:18.860576735 +0000 UTC m=+293.697179885" observedRunningTime="2026-04-16 10:10:19.745383755 +0000 UTC m=+294.581986926" watchObservedRunningTime="2026-04-16 10:10:19.747120831 +0000 UTC m=+294.583724001" Apr 16 10:10:25.623446 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:25.623418 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 10:10:40.828068 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.828033 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-kkz6p"] Apr 16 10:10:40.862183 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.862136 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-kkz6p"] Apr 16 10:10:40.862321 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.862275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:40.865193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.865172 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 10:10:40.865307 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.865173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 10:10:40.874061 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.874039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-fphkn\"" Apr 16 10:10:40.969212 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.969139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh966\" (UniqueName: \"kubernetes.io/projected/caf1b61c-ca01-4e78-aeb8-76bb85413d0e-kube-api-access-nh966\") pod 
\"cert-manager-759f64656b-kkz6p\" (UID: \"caf1b61c-ca01-4e78-aeb8-76bb85413d0e\") " pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:40.969388 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:40.969328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/caf1b61c-ca01-4e78-aeb8-76bb85413d0e-bound-sa-token\") pod \"cert-manager-759f64656b-kkz6p\" (UID: \"caf1b61c-ca01-4e78-aeb8-76bb85413d0e\") " pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:41.070254 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.070208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/caf1b61c-ca01-4e78-aeb8-76bb85413d0e-bound-sa-token\") pod \"cert-manager-759f64656b-kkz6p\" (UID: \"caf1b61c-ca01-4e78-aeb8-76bb85413d0e\") " pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:41.070433 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.070296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh966\" (UniqueName: \"kubernetes.io/projected/caf1b61c-ca01-4e78-aeb8-76bb85413d0e-kube-api-access-nh966\") pod \"cert-manager-759f64656b-kkz6p\" (UID: \"caf1b61c-ca01-4e78-aeb8-76bb85413d0e\") " pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:41.078338 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.078265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/caf1b61c-ca01-4e78-aeb8-76bb85413d0e-bound-sa-token\") pod \"cert-manager-759f64656b-kkz6p\" (UID: \"caf1b61c-ca01-4e78-aeb8-76bb85413d0e\") " pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:41.078456 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.078380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh966\" 
(UniqueName: \"kubernetes.io/projected/caf1b61c-ca01-4e78-aeb8-76bb85413d0e-kube-api-access-nh966\") pod \"cert-manager-759f64656b-kkz6p\" (UID: \"caf1b61c-ca01-4e78-aeb8-76bb85413d0e\") " pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:41.184422 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.184363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-kkz6p" Apr 16 10:10:41.321564 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.321538 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-kkz6p"] Apr 16 10:10:41.324080 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:10:41.324048 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaf1b61c_ca01_4e78_aeb8_76bb85413d0e.slice/crio-49ca6cc692031be7c69686fbffa2481b0db913387001423b5fc557c5755ff920 WatchSource:0}: Error finding container 49ca6cc692031be7c69686fbffa2481b0db913387001423b5fc557c5755ff920: Status 404 returned error can't find the container with id 49ca6cc692031be7c69686fbffa2481b0db913387001423b5fc557c5755ff920 Apr 16 10:10:41.326467 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.326452 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:10:41.797290 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:41.797197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-kkz6p" event={"ID":"caf1b61c-ca01-4e78-aeb8-76bb85413d0e","Type":"ContainerStarted","Data":"49ca6cc692031be7c69686fbffa2481b0db913387001423b5fc557c5755ff920"} Apr 16 10:10:44.808366 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:44.808323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-kkz6p" 
event={"ID":"caf1b61c-ca01-4e78-aeb8-76bb85413d0e","Type":"ContainerStarted","Data":"6e5d86894ac99f5f6aa4e468c99a232a9c8c0f454eb6bb02b90b32469d0dffc4"} Apr 16 10:10:44.824442 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:44.824390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-kkz6p" podStartSLOduration=2.331695038 podStartE2EDuration="4.824373194s" podCreationTimestamp="2026-04-16 10:10:40 +0000 UTC" firstStartedPulling="2026-04-16 10:10:41.326588876 +0000 UTC m=+316.163192024" lastFinishedPulling="2026-04-16 10:10:43.819267018 +0000 UTC m=+318.655870180" observedRunningTime="2026-04-16 10:10:44.823775396 +0000 UTC m=+319.660378566" watchObservedRunningTime="2026-04-16 10:10:44.824373194 +0000 UTC m=+319.660976364" Apr 16 10:10:45.159011 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.158974 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv"] Apr 16 10:10:45.162776 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.162755 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.165517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.165491 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 10:10:45.165517 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.165513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 10:10:45.165694 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.165499 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zq6mv\"" Apr 16 10:10:45.170554 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.170528 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv"] Apr 16 10:10:45.309167 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.309126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.309374 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.309203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gn8\" (UniqueName: \"kubernetes.io/projected/dadd74ac-b6be-4492-85c5-0aac78bd6080-kube-api-access-c7gn8\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.309374 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.309268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.409856 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.409765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.409856 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.409811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gn8\" (UniqueName: \"kubernetes.io/projected/dadd74ac-b6be-4492-85c5-0aac78bd6080-kube-api-access-c7gn8\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.410083 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.409932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.410223 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.410203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.410364 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.410346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.418360 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.418336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gn8\" (UniqueName: \"kubernetes.io/projected/dadd74ac-b6be-4492-85c5-0aac78bd6080-kube-api-access-c7gn8\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.473343 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.473306 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:45.597244 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.597216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv"] Apr 16 10:10:45.600067 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:10:45.600032 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddadd74ac_b6be_4492_85c5_0aac78bd6080.slice/crio-17df2ceab71f6e8a0ff2bbf1ae3e57b2329ee5c24b14ee90e8386fa64f0846cf WatchSource:0}: Error finding container 17df2ceab71f6e8a0ff2bbf1ae3e57b2329ee5c24b14ee90e8386fa64f0846cf: Status 404 returned error can't find the container with id 17df2ceab71f6e8a0ff2bbf1ae3e57b2329ee5c24b14ee90e8386fa64f0846cf Apr 16 10:10:45.813030 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.812995 2575 generic.go:358] "Generic (PLEG): container finished" podID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerID="f203beb66d97ad620a4f640b6a7f4a09a8d7e89a799b535564736bf1f32c65a6" exitCode=0 Apr 16 10:10:45.813454 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.813072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" event={"ID":"dadd74ac-b6be-4492-85c5-0aac78bd6080","Type":"ContainerDied","Data":"f203beb66d97ad620a4f640b6a7f4a09a8d7e89a799b535564736bf1f32c65a6"} Apr 16 10:10:45.813454 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:45.813105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" event={"ID":"dadd74ac-b6be-4492-85c5-0aac78bd6080","Type":"ContainerStarted","Data":"17df2ceab71f6e8a0ff2bbf1ae3e57b2329ee5c24b14ee90e8386fa64f0846cf"} Apr 16 10:10:48.826342 ip-10-0-135-215 kubenswrapper[2575]: 
I0416 10:10:48.826306 2575 generic.go:358] "Generic (PLEG): container finished" podID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerID="b285297cd7e442d1abd200f57e67a5f958e1ef20a740358ef4b2b3291e84ffa1" exitCode=0 Apr 16 10:10:48.826761 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:48.826364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" event={"ID":"dadd74ac-b6be-4492-85c5-0aac78bd6080","Type":"ContainerDied","Data":"b285297cd7e442d1abd200f57e67a5f958e1ef20a740358ef4b2b3291e84ffa1"} Apr 16 10:10:49.832058 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:49.832023 2575 generic.go:358] "Generic (PLEG): container finished" podID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerID="e78337b455ed8a78d53cce84a2eb8106ffaaa155807b687513861351422f54bf" exitCode=0 Apr 16 10:10:49.832496 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:49.832116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" event={"ID":"dadd74ac-b6be-4492-85c5-0aac78bd6080","Type":"ContainerDied","Data":"e78337b455ed8a78d53cce84a2eb8106ffaaa155807b687513861351422f54bf"} Apr 16 10:10:50.965969 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:50.965938 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:51.060147 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.060113 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-util\") pod \"dadd74ac-b6be-4492-85c5-0aac78bd6080\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " Apr 16 10:10:51.060368 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.060173 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7gn8\" (UniqueName: \"kubernetes.io/projected/dadd74ac-b6be-4492-85c5-0aac78bd6080-kube-api-access-c7gn8\") pod \"dadd74ac-b6be-4492-85c5-0aac78bd6080\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " Apr 16 10:10:51.060368 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.060198 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-bundle\") pod \"dadd74ac-b6be-4492-85c5-0aac78bd6080\" (UID: \"dadd74ac-b6be-4492-85c5-0aac78bd6080\") " Apr 16 10:10:51.060592 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.060556 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-bundle" (OuterVolumeSpecName: "bundle") pod "dadd74ac-b6be-4492-85c5-0aac78bd6080" (UID: "dadd74ac-b6be-4492-85c5-0aac78bd6080"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:10:51.062373 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.062343 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadd74ac-b6be-4492-85c5-0aac78bd6080-kube-api-access-c7gn8" (OuterVolumeSpecName: "kube-api-access-c7gn8") pod "dadd74ac-b6be-4492-85c5-0aac78bd6080" (UID: "dadd74ac-b6be-4492-85c5-0aac78bd6080"). InnerVolumeSpecName "kube-api-access-c7gn8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:10:51.064558 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.064512 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-util" (OuterVolumeSpecName: "util") pod "dadd74ac-b6be-4492-85c5-0aac78bd6080" (UID: "dadd74ac-b6be-4492-85c5-0aac78bd6080"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:10:51.161691 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.161605 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-util\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:10:51.161691 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.161636 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7gn8\" (UniqueName: \"kubernetes.io/projected/dadd74ac-b6be-4492-85c5-0aac78bd6080-kube-api-access-c7gn8\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:10:51.161691 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.161646 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dadd74ac-b6be-4492-85c5-0aac78bd6080-bundle\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:10:51.839876 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.839830 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" event={"ID":"dadd74ac-b6be-4492-85c5-0aac78bd6080","Type":"ContainerDied","Data":"17df2ceab71f6e8a0ff2bbf1ae3e57b2329ee5c24b14ee90e8386fa64f0846cf"} Apr 16 10:10:51.839876 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.839876 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17df2ceab71f6e8a0ff2bbf1ae3e57b2329ee5c24b14ee90e8386fa64f0846cf" Apr 16 10:10:51.840088 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:51.839887 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2hvqv" Apr 16 10:10:57.121356 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121319 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8"] Apr 16 10:10:57.121739 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121667 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="extract" Apr 16 10:10:57.121739 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121678 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="extract" Apr 16 10:10:57.121739 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121693 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="util" Apr 16 10:10:57.121739 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121699 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="util" Apr 16 10:10:57.121739 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121715 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="pull" Apr 16 10:10:57.121739 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121722 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="pull" Apr 16 10:10:57.121925 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.121769 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dadd74ac-b6be-4492-85c5-0aac78bd6080" containerName="extract" Apr 16 10:10:57.125432 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.125409 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.127849 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.127827 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-lvz4t\"" Apr 16 10:10:57.128146 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.128129 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:10:57.128749 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.128735 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 16 10:10:57.132485 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.132463 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8"] Apr 16 10:10:57.214630 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.214587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2bc1d90f-b78e-498a-b4ce-74f985771419-tmp\") pod \"jobset-operator-747c5859c7-2h7b8\" (UID: \"2bc1d90f-b78e-498a-b4ce-74f985771419\") " 
pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.214862 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.214692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7dd\" (UniqueName: \"kubernetes.io/projected/2bc1d90f-b78e-498a-b4ce-74f985771419-kube-api-access-7z7dd\") pod \"jobset-operator-747c5859c7-2h7b8\" (UID: \"2bc1d90f-b78e-498a-b4ce-74f985771419\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.315981 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.315939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2bc1d90f-b78e-498a-b4ce-74f985771419-tmp\") pod \"jobset-operator-747c5859c7-2h7b8\" (UID: \"2bc1d90f-b78e-498a-b4ce-74f985771419\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.316172 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.316003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7dd\" (UniqueName: \"kubernetes.io/projected/2bc1d90f-b78e-498a-b4ce-74f985771419-kube-api-access-7z7dd\") pod \"jobset-operator-747c5859c7-2h7b8\" (UID: \"2bc1d90f-b78e-498a-b4ce-74f985771419\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.316364 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.316345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2bc1d90f-b78e-498a-b4ce-74f985771419-tmp\") pod \"jobset-operator-747c5859c7-2h7b8\" (UID: \"2bc1d90f-b78e-498a-b4ce-74f985771419\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.324935 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.324912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7dd\" (UniqueName: 
\"kubernetes.io/projected/2bc1d90f-b78e-498a-b4ce-74f985771419-kube-api-access-7z7dd\") pod \"jobset-operator-747c5859c7-2h7b8\" (UID: \"2bc1d90f-b78e-498a-b4ce-74f985771419\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.436185 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.436074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" Apr 16 10:10:57.556892 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.556861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8"] Apr 16 10:10:57.558742 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:10:57.558710 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bc1d90f_b78e_498a_b4ce_74f985771419.slice/crio-4d8f77ed21a0bc184a6bb509f656dbbce5c64fa9e2600e6194a63c9c122d0c25 WatchSource:0}: Error finding container 4d8f77ed21a0bc184a6bb509f656dbbce5c64fa9e2600e6194a63c9c122d0c25: Status 404 returned error can't find the container with id 4d8f77ed21a0bc184a6bb509f656dbbce5c64fa9e2600e6194a63c9c122d0c25 Apr 16 10:10:57.865359 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:10:57.865321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" event={"ID":"2bc1d90f-b78e-498a-b4ce-74f985771419","Type":"ContainerStarted","Data":"4d8f77ed21a0bc184a6bb509f656dbbce5c64fa9e2600e6194a63c9c122d0c25"} Apr 16 10:11:00.879516 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:11:00.879474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" event={"ID":"2bc1d90f-b78e-498a-b4ce-74f985771419","Type":"ContainerStarted","Data":"e20e407d26bd106f3486d6140c0e77aa31da2136825eff9fed65946117099c3a"} Apr 16 10:11:00.895123 ip-10-0-135-215 kubenswrapper[2575]: 
I0416 10:11:00.895071 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-2h7b8" podStartSLOduration=1.45120226 podStartE2EDuration="3.895057657s" podCreationTimestamp="2026-04-16 10:10:57 +0000 UTC" firstStartedPulling="2026-04-16 10:10:57.560715995 +0000 UTC m=+332.397319144" lastFinishedPulling="2026-04-16 10:11:00.004571392 +0000 UTC m=+334.841174541" observedRunningTime="2026-04-16 10:11:00.893863601 +0000 UTC m=+335.730466785" watchObservedRunningTime="2026-04-16 10:11:00.895057657 +0000 UTC m=+335.731660828" Apr 16 10:13:18.443766 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.443728 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx"] Apr 16 10:13:18.446960 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.446943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:13:18.449342 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.449320 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-zrbm8\"/\"openshift-service-ca.crt\"" Apr 16 10:13:18.450383 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.450364 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-zrbm8\"/\"kube-root-ca.crt\"" Apr 16 10:13:18.450516 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.450370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-zrbm8\"/\"default-dockercfg-ltmjg\"" Apr 16 10:13:18.460137 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.460115 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx"] Apr 16 10:13:18.540682 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.540644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-78vv4\" (UniqueName: \"kubernetes.io/projected/235732e9-5820-4c65-99c0-cec3885fa876-kube-api-access-78vv4\") pod \"test-trainjob-r845r-node-0-0-lj7dx\" (UID: \"235732e9-5820-4c65-99c0-cec3885fa876\") " pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:13:18.641867 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.641835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78vv4\" (UniqueName: \"kubernetes.io/projected/235732e9-5820-4c65-99c0-cec3885fa876-kube-api-access-78vv4\") pod \"test-trainjob-r845r-node-0-0-lj7dx\" (UID: \"235732e9-5820-4c65-99c0-cec3885fa876\") " pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:13:18.649825 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.649799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vv4\" (UniqueName: \"kubernetes.io/projected/235732e9-5820-4c65-99c0-cec3885fa876-kube-api-access-78vv4\") pod \"test-trainjob-r845r-node-0-0-lj7dx\" (UID: \"235732e9-5820-4c65-99c0-cec3885fa876\") " pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:13:18.756826 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.756742 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:13:18.877519 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:18.877493 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx"] Apr 16 10:13:18.880342 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:13:18.880313 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod235732e9_5820_4c65_99c0_cec3885fa876.slice/crio-ecbf13f229b44028125911979f59d54c3b96d4a8a260dd95f4380d6ecf540cfc WatchSource:0}: Error finding container ecbf13f229b44028125911979f59d54c3b96d4a8a260dd95f4380d6ecf540cfc: Status 404 returned error can't find the container with id ecbf13f229b44028125911979f59d54c3b96d4a8a260dd95f4380d6ecf540cfc Apr 16 10:13:19.343799 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:19.343758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" event={"ID":"235732e9-5820-4c65-99c0-cec3885fa876","Type":"ContainerStarted","Data":"ecbf13f229b44028125911979f59d54c3b96d4a8a260dd95f4380d6ecf540cfc"} Apr 16 10:13:32.324688 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.324571 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54c67d644c-vwmmw"] Apr 16 10:13:32.349337 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.349297 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54c67d644c-vwmmw"] Apr 16 10:13:32.349523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.349482 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486356 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-config\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486356 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-service-ca\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486619 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wxh\" (UniqueName: \"kubernetes.io/projected/9b184971-0296-4981-a2a4-1f92d9e4bac0-kube-api-access-g2wxh\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486619 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-serving-cert\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486619 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-trusted-ca-bundle\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486619 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-oauth-config\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.486782 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.486622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-oauth-serving-cert\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587698 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.587611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-config\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587698 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.587650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-service-ca\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587698 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:13:32.587668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wxh\" (UniqueName: \"kubernetes.io/projected/9b184971-0296-4981-a2a4-1f92d9e4bac0-kube-api-access-g2wxh\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.587707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-serving-cert\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.587737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-trusted-ca-bundle\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.587766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-oauth-config\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.587994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.587791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-oauth-serving-cert\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " 
pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.588486 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.588463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-service-ca\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.588603 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.588539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-oauth-serving-cert\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.588712 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.588684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-trusted-ca-bundle\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.588841 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.588756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-config\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.590452 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.590430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-oauth-config\") pod \"console-54c67d644c-vwmmw\" (UID: 
\"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.590563 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.590550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b184971-0296-4981-a2a4-1f92d9e4bac0-console-serving-cert\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.596485 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.596463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wxh\" (UniqueName: \"kubernetes.io/projected/9b184971-0296-4981-a2a4-1f92d9e4bac0-kube-api-access-g2wxh\") pod \"console-54c67d644c-vwmmw\" (UID: \"9b184971-0296-4981-a2a4-1f92d9e4bac0\") " pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:32.662950 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:32.662911 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:36.782115 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:36.782086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54c67d644c-vwmmw"] Apr 16 10:13:36.783933 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:13:36.783905 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b184971_0296_4981_a2a4_1f92d9e4bac0.slice/crio-e6fb579e1c6f8afea2e14838429d69929e28bbe4302499e61c4a180e7faea40b WatchSource:0}: Error finding container e6fb579e1c6f8afea2e14838429d69929e28bbe4302499e61c4a180e7faea40b: Status 404 returned error can't find the container with id e6fb579e1c6f8afea2e14838429d69929e28bbe4302499e61c4a180e7faea40b Apr 16 10:13:37.438064 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:37.438027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c67d644c-vwmmw" event={"ID":"9b184971-0296-4981-a2a4-1f92d9e4bac0","Type":"ContainerStarted","Data":"1ac91f9c4bbd6590902031e9a919650fe45e3faa991e49a966b30a86bc270841"} Apr 16 10:13:37.438064 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:37.438063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c67d644c-vwmmw" event={"ID":"9b184971-0296-4981-a2a4-1f92d9e4bac0","Type":"ContainerStarted","Data":"e6fb579e1c6f8afea2e14838429d69929e28bbe4302499e61c4a180e7faea40b"} Apr 16 10:13:37.457217 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:37.457143 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54c67d644c-vwmmw" podStartSLOduration=5.457122411 podStartE2EDuration="5.457122411s" podCreationTimestamp="2026-04-16 10:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:13:37.455230506 +0000 UTC 
m=+492.291833678" watchObservedRunningTime="2026-04-16 10:13:37.457122411 +0000 UTC m=+492.293725583" Apr 16 10:13:42.663875 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:42.663833 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:42.664433 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:42.663905 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:42.669080 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:42.669058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:43.464072 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:43.464040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54c67d644c-vwmmw" Apr 16 10:13:43.508375 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:13:43.508345 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69fbc857dd-27zbk"] Apr 16 10:14:08.533348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.533249 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69fbc857dd-27zbk" podUID="046451eb-ee97-4561-9a0e-e1c3bf6be268" containerName="console" containerID="cri-o://0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f" gracePeriod=15 Apr 16 10:14:08.791322 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.791255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69fbc857dd-27zbk_046451eb-ee97-4561-9a0e-e1c3bf6be268/console/0.log" Apr 16 10:14:08.791322 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.791318 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:14:08.927637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927595 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-oauth-config\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.927828 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927687 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcwd4\" (UniqueName: \"kubernetes.io/projected/046451eb-ee97-4561-9a0e-e1c3bf6be268-kube-api-access-jcwd4\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.927828 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927730 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-service-ca\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.927828 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927766 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-serving-cert\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.927828 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927805 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-trusted-ca-bundle\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.928034 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-oauth-serving-cert\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.928034 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.927864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-config\") pod \"046451eb-ee97-4561-9a0e-e1c3bf6be268\" (UID: \"046451eb-ee97-4561-9a0e-e1c3bf6be268\") " Apr 16 10:14:08.928249 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.928215 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-service-ca" (OuterVolumeSpecName: "service-ca") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:14:08.928371 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.928241 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:14:08.928371 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.928264 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:14:08.928495 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.928471 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-config" (OuterVolumeSpecName: "console-config") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:14:08.929972 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.929946 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:14:08.930068 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.930045 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:14:08.930110 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:08.930054 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046451eb-ee97-4561-9a0e-e1c3bf6be268-kube-api-access-jcwd4" (OuterVolumeSpecName: "kube-api-access-jcwd4") pod "046451eb-ee97-4561-9a0e-e1c3bf6be268" (UID: "046451eb-ee97-4561-9a0e-e1c3bf6be268"). InnerVolumeSpecName "kube-api-access-jcwd4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:14:09.028570 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028530 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-oauth-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.028570 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028570 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jcwd4\" (UniqueName: \"kubernetes.io/projected/046451eb-ee97-4561-9a0e-e1c3bf6be268-kube-api-access-jcwd4\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.028784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028589 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-service-ca\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.028784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028605 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-serving-cert\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.028784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028619 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-trusted-ca-bundle\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.028784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028633 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-oauth-serving-cert\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.028784 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.028648 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/046451eb-ee97-4561-9a0e-e1c3bf6be268-console-config\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:14:09.561263 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.561230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69fbc857dd-27zbk_046451eb-ee97-4561-9a0e-e1c3bf6be268/console/0.log" Apr 16 10:14:09.561735 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.561273 2575 generic.go:358] "Generic (PLEG): container finished" podID="046451eb-ee97-4561-9a0e-e1c3bf6be268" containerID="0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f" exitCode=2 Apr 16 10:14:09.561735 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.561307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69fbc857dd-27zbk" event={"ID":"046451eb-ee97-4561-9a0e-e1c3bf6be268","Type":"ContainerDied","Data":"0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f"} Apr 16 10:14:09.561735 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.561335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69fbc857dd-27zbk" event={"ID":"046451eb-ee97-4561-9a0e-e1c3bf6be268","Type":"ContainerDied","Data":"8c3cc96ef0b493845a6de02ebf9c2dbb715cd8e05bd6c2228a551dd1b3dcbcaa"} Apr 16 10:14:09.561735 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:14:09.561361 2575 scope.go:117] "RemoveContainer" containerID="0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f" Apr 16 10:14:09.561735 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.561393 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69fbc857dd-27zbk" Apr 16 10:14:09.572407 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.572384 2575 scope.go:117] "RemoveContainer" containerID="0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f" Apr 16 10:14:09.572777 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:14:09.572757 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f\": container with ID starting with 0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f not found: ID does not exist" containerID="0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f" Apr 16 10:14:09.572857 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.572788 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f"} err="failed to get container status \"0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f\": rpc error: code = NotFound desc = could not find container \"0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f\": container with ID starting with 0034b775bc42aea7a09abd4e13b5ac51ef6bee29c13c68373c1913b308b6392f not found: ID does not exist" Apr 16 10:14:09.586822 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.586770 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69fbc857dd-27zbk"] Apr 16 10:14:09.588620 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.588596 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-69fbc857dd-27zbk"] Apr 16 10:14:09.733994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:14:09.733955 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046451eb-ee97-4561-9a0e-e1c3bf6be268" path="/var/lib/kubelet/pods/046451eb-ee97-4561-9a0e-e1c3bf6be268/volumes" Apr 16 10:17:50.366278 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:50.366243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" event={"ID":"235732e9-5820-4c65-99c0-cec3885fa876","Type":"ContainerStarted","Data":"05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183"} Apr 16 10:17:50.397034 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:50.396971 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" podStartSLOduration=1.510510879 podStartE2EDuration="4m32.396955094s" podCreationTimestamp="2026-04-16 10:13:18 +0000 UTC" firstStartedPulling="2026-04-16 10:13:18.882369725 +0000 UTC m=+473.718972875" lastFinishedPulling="2026-04-16 10:17:49.76881394 +0000 UTC m=+744.605417090" observedRunningTime="2026-04-16 10:17:50.387836794 +0000 UTC m=+745.224439975" watchObservedRunningTime="2026-04-16 10:17:50.396955094 +0000 UTC m=+745.233558244" Apr 16 10:17:55.384406 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:55.384319 2575 generic.go:358] "Generic (PLEG): container finished" podID="235732e9-5820-4c65-99c0-cec3885fa876" containerID="05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183" exitCode=0 Apr 16 10:17:55.384866 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:55.384398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" event={"ID":"235732e9-5820-4c65-99c0-cec3885fa876","Type":"ContainerDied","Data":"05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183"} Apr 16 10:17:56.610232 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:17:56.610205 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:17:56.709666 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:56.709585 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vv4\" (UniqueName: \"kubernetes.io/projected/235732e9-5820-4c65-99c0-cec3885fa876-kube-api-access-78vv4\") pod \"235732e9-5820-4c65-99c0-cec3885fa876\" (UID: \"235732e9-5820-4c65-99c0-cec3885fa876\") " Apr 16 10:17:56.711861 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:56.711826 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235732e9-5820-4c65-99c0-cec3885fa876-kube-api-access-78vv4" (OuterVolumeSpecName: "kube-api-access-78vv4") pod "235732e9-5820-4c65-99c0-cec3885fa876" (UID: "235732e9-5820-4c65-99c0-cec3885fa876"). InnerVolumeSpecName "kube-api-access-78vv4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:17:56.810796 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:56.810760 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78vv4\" (UniqueName: \"kubernetes.io/projected/235732e9-5820-4c65-99c0-cec3885fa876-kube-api-access-78vv4\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:17:57.392815 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:57.392782 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" Apr 16 10:17:57.392815 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:57.392791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx" event={"ID":"235732e9-5820-4c65-99c0-cec3885fa876","Type":"ContainerDied","Data":"ecbf13f229b44028125911979f59d54c3b96d4a8a260dd95f4380d6ecf540cfc"} Apr 16 10:17:57.392815 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:57.392824 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecbf13f229b44028125911979f59d54c3b96d4a8a260dd95f4380d6ecf540cfc" Apr 16 10:17:58.607012 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.606981 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp"] Apr 16 10:17:58.607428 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.607410 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046451eb-ee97-4561-9a0e-e1c3bf6be268" containerName="console" Apr 16 10:17:58.607475 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.607430 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="046451eb-ee97-4561-9a0e-e1c3bf6be268" containerName="console" Apr 16 10:17:58.607475 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.607438 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="235732e9-5820-4c65-99c0-cec3885fa876" containerName="node" Apr 16 10:17:58.607475 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.607444 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="235732e9-5820-4c65-99c0-cec3885fa876" containerName="node" Apr 16 10:17:58.607581 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.607498 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="046451eb-ee97-4561-9a0e-e1c3bf6be268" containerName="console" Apr 16 10:17:58.607581 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:17:58.607509 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="235732e9-5820-4c65-99c0-cec3885fa876" containerName="node" Apr 16 10:17:58.789364 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.789327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp"] Apr 16 10:17:58.789534 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.789454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:17:58.792377 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.792352 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hgngd\"/\"openshift-service-ca.crt\"" Apr 16 10:17:58.792377 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.792365 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hgngd\"/\"kube-root-ca.crt\"" Apr 16 10:17:58.793409 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.793390 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-hgngd\"/\"default-dockercfg-m7s9v\"" Apr 16 10:17:58.930289 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:58.930194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p69ww\" (UniqueName: \"kubernetes.io/projected/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19-kube-api-access-p69ww\") pod \"test-trainjob-kqfd8-node-0-0-bpmlp\" (UID: \"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19\") " pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:17:59.031368 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:59.031333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p69ww\" (UniqueName: \"kubernetes.io/projected/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19-kube-api-access-p69ww\") pod \"test-trainjob-kqfd8-node-0-0-bpmlp\" (UID: 
\"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19\") " pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:17:59.040124 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:59.040101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p69ww\" (UniqueName: \"kubernetes.io/projected/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19-kube-api-access-p69ww\") pod \"test-trainjob-kqfd8-node-0-0-bpmlp\" (UID: \"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19\") " pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:17:59.098476 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:59.098443 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:17:59.265325 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:59.265299 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp"] Apr 16 10:17:59.267781 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:17:59.267754 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a28c57a_7a03_41c7_b94d_2a1cd8b0ba19.slice/crio-1f1b43bfd60ad715a31d6c60eb8094c7a4a3b4f0eb57e12600cbc665aacde86a WatchSource:0}: Error finding container 1f1b43bfd60ad715a31d6c60eb8094c7a4a3b4f0eb57e12600cbc665aacde86a: Status 404 returned error can't find the container with id 1f1b43bfd60ad715a31d6c60eb8094c7a4a3b4f0eb57e12600cbc665aacde86a Apr 16 10:17:59.269800 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:59.269779 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:17:59.401058 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:17:59.401020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" 
event={"ID":"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19","Type":"ContainerStarted","Data":"1f1b43bfd60ad715a31d6c60eb8094c7a4a3b4f0eb57e12600cbc665aacde86a"} Apr 16 10:22:10.353334 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:10.353292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" event={"ID":"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19","Type":"ContainerStarted","Data":"39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6"} Apr 16 10:22:10.379230 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:10.379174 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" podStartSLOduration=1.917518498 podStartE2EDuration="4m12.379143978s" podCreationTimestamp="2026-04-16 10:17:58 +0000 UTC" firstStartedPulling="2026-04-16 10:17:59.26990704 +0000 UTC m=+754.106510192" lastFinishedPulling="2026-04-16 10:22:09.73153252 +0000 UTC m=+1004.568135672" observedRunningTime="2026-04-16 10:22:10.376317314 +0000 UTC m=+1005.212920485" watchObservedRunningTime="2026-04-16 10:22:10.379143978 +0000 UTC m=+1005.215747149" Apr 16 10:22:17.377138 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:17.377102 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19" containerID="39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6" exitCode=0 Apr 16 10:22:17.377653 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:17.377184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" event={"ID":"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19","Type":"ContainerDied","Data":"39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6"} Apr 16 10:22:18.675902 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:18.675879 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:22:18.841717 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:18.841683 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p69ww\" (UniqueName: \"kubernetes.io/projected/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19-kube-api-access-p69ww\") pod \"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19\" (UID: \"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19\") " Apr 16 10:22:18.844033 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:18.844005 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19-kube-api-access-p69ww" (OuterVolumeSpecName: "kube-api-access-p69ww") pod "0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19" (UID: "0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19"). InnerVolumeSpecName "kube-api-access-p69ww". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:22:18.942464 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:18.942386 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p69ww\" (UniqueName: \"kubernetes.io/projected/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19-kube-api-access-p69ww\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:22:19.385646 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.385614 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" Apr 16 10:22:19.385816 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.385645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp" event={"ID":"0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19","Type":"ContainerDied","Data":"1f1b43bfd60ad715a31d6c60eb8094c7a4a3b4f0eb57e12600cbc665aacde86a"} Apr 16 10:22:19.385816 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.385680 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f1b43bfd60ad715a31d6c60eb8094c7a4a3b4f0eb57e12600cbc665aacde86a" Apr 16 10:22:19.654458 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.654382 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2"] Apr 16 10:22:19.654787 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.654774 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19" containerName="node" Apr 16 10:22:19.654835 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.654789 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19" containerName="node" Apr 16 10:22:19.654869 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.654850 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19" containerName="node" Apr 16 10:22:19.883228 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.883189 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2"] Apr 16 10:22:19.883642 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.883327 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:22:19.885990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.885967 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-87dcb\"/\"openshift-service-ca.crt\"" Apr 16 10:22:19.886885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.886868 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-87dcb\"/\"kube-root-ca.crt\"" Apr 16 10:22:19.886885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:19.886880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-87dcb\"/\"default-dockercfg-8f2g4\"" Apr 16 10:22:20.050897 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:20.050803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dfhd\" (UniqueName: \"kubernetes.io/projected/4bdba224-6fa0-42bb-b146-dc9c5448d33d-kube-api-access-7dfhd\") pod \"test-trainjob-wtdjj-node-0-0-772z2\" (UID: \"4bdba224-6fa0-42bb-b146-dc9c5448d33d\") " pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:22:20.151707 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:20.151665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dfhd\" (UniqueName: \"kubernetes.io/projected/4bdba224-6fa0-42bb-b146-dc9c5448d33d-kube-api-access-7dfhd\") pod \"test-trainjob-wtdjj-node-0-0-772z2\" (UID: \"4bdba224-6fa0-42bb-b146-dc9c5448d33d\") " pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:22:20.159674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:20.159641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dfhd\" (UniqueName: \"kubernetes.io/projected/4bdba224-6fa0-42bb-b146-dc9c5448d33d-kube-api-access-7dfhd\") pod \"test-trainjob-wtdjj-node-0-0-772z2\" (UID: \"4bdba224-6fa0-42bb-b146-dc9c5448d33d\") " 
pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:22:20.193612 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:20.193578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:22:20.315959 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:20.315928 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2"] Apr 16 10:22:20.320149 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:22:20.320120 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bdba224_6fa0_42bb_b146_dc9c5448d33d.slice/crio-6e046ddf97a437c1d7e8ae546f7170e97db0fc39f28217cc5a62973276aa6592 WatchSource:0}: Error finding container 6e046ddf97a437c1d7e8ae546f7170e97db0fc39f28217cc5a62973276aa6592: Status 404 returned error can't find the container with id 6e046ddf97a437c1d7e8ae546f7170e97db0fc39f28217cc5a62973276aa6592 Apr 16 10:22:20.390279 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:22:20.390243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" event={"ID":"4bdba224-6fa0-42bb-b146-dc9c5448d33d","Type":"ContainerStarted","Data":"6e046ddf97a437c1d7e8ae546f7170e97db0fc39f28217cc5a62973276aa6592"} Apr 16 10:23:34.693492 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:34.693450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" event={"ID":"4bdba224-6fa0-42bb-b146-dc9c5448d33d","Type":"ContainerStarted","Data":"6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5"} Apr 16 10:23:34.710033 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:34.709975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" podStartSLOduration=1.7719454350000001 podStartE2EDuration="1m15.709958135s" 
podCreationTimestamp="2026-04-16 10:22:19 +0000 UTC" firstStartedPulling="2026-04-16 10:22:20.322101753 +0000 UTC m=+1015.158704903" lastFinishedPulling="2026-04-16 10:23:34.260114453 +0000 UTC m=+1089.096717603" observedRunningTime="2026-04-16 10:23:34.707771098 +0000 UTC m=+1089.544374269" watchObservedRunningTime="2026-04-16 10:23:34.709958135 +0000 UTC m=+1089.546561306" Apr 16 10:23:37.706262 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:37.706209 2575 generic.go:358] "Generic (PLEG): container finished" podID="4bdba224-6fa0-42bb-b146-dc9c5448d33d" containerID="6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5" exitCode=0 Apr 16 10:23:37.706654 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:37.706283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" event={"ID":"4bdba224-6fa0-42bb-b146-dc9c5448d33d","Type":"ContainerDied","Data":"6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5"} Apr 16 10:23:38.894128 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:38.894102 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:23:38.987610 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:38.987520 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dfhd\" (UniqueName: \"kubernetes.io/projected/4bdba224-6fa0-42bb-b146-dc9c5448d33d-kube-api-access-7dfhd\") pod \"4bdba224-6fa0-42bb-b146-dc9c5448d33d\" (UID: \"4bdba224-6fa0-42bb-b146-dc9c5448d33d\") " Apr 16 10:23:38.989707 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:38.989684 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdba224-6fa0-42bb-b146-dc9c5448d33d-kube-api-access-7dfhd" (OuterVolumeSpecName: "kube-api-access-7dfhd") pod "4bdba224-6fa0-42bb-b146-dc9c5448d33d" (UID: "4bdba224-6fa0-42bb-b146-dc9c5448d33d"). 
InnerVolumeSpecName "kube-api-access-7dfhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:23:39.088138 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:39.088100 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7dfhd\" (UniqueName: \"kubernetes.io/projected/4bdba224-6fa0-42bb-b146-dc9c5448d33d-kube-api-access-7dfhd\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:23:39.714024 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:39.713996 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" Apr 16 10:23:39.714024 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:39.714005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2" event={"ID":"4bdba224-6fa0-42bb-b146-dc9c5448d33d","Type":"ContainerDied","Data":"6e046ddf97a437c1d7e8ae546f7170e97db0fc39f28217cc5a62973276aa6592"} Apr 16 10:23:39.714024 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:39.714032 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e046ddf97a437c1d7e8ae546f7170e97db0fc39f28217cc5a62973276aa6592" Apr 16 10:23:40.025594 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.025499 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x"] Apr 16 10:23:40.026043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.026025 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bdba224-6fa0-42bb-b146-dc9c5448d33d" containerName="node" Apr 16 10:23:40.026106 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.026045 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdba224-6fa0-42bb-b146-dc9c5448d33d" containerName="node" Apr 16 10:23:40.026193 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.026132 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4bdba224-6fa0-42bb-b146-dc9c5448d33d" containerName="node" Apr 16 10:23:40.069773 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.069735 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x"] Apr 16 10:23:40.069952 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.069867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 10:23:40.072607 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.072583 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-qqzv7\"/\"openshift-service-ca.crt\"" Apr 16 10:23:40.072723 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.072610 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-qqzv7\"/\"kube-root-ca.crt\"" Apr 16 10:23:40.073499 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.073485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-qqzv7\"/\"default-dockercfg-2k5h2\"" Apr 16 10:23:40.199183 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.199118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv99k\" (UniqueName: \"kubernetes.io/projected/59c14b48-1bc0-49e1-a635-207379490871-kube-api-access-gv99k\") pod \"test-trainjob-4sxmt-node-0-0-rrv8x\" (UID: \"59c14b48-1bc0-49e1-a635-207379490871\") " pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 10:23:40.300695 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.300592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv99k\" (UniqueName: \"kubernetes.io/projected/59c14b48-1bc0-49e1-a635-207379490871-kube-api-access-gv99k\") pod \"test-trainjob-4sxmt-node-0-0-rrv8x\" (UID: \"59c14b48-1bc0-49e1-a635-207379490871\") " pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 
10:23:40.308680 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.308645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv99k\" (UniqueName: \"kubernetes.io/projected/59c14b48-1bc0-49e1-a635-207379490871-kube-api-access-gv99k\") pod \"test-trainjob-4sxmt-node-0-0-rrv8x\" (UID: \"59c14b48-1bc0-49e1-a635-207379490871\") " pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 10:23:40.379340 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.379295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 10:23:40.502974 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.502951 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x"] Apr 16 10:23:40.505233 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:23:40.505206 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59c14b48_1bc0_49e1_a635_207379490871.slice/crio-ee47ec825008436bf3ebee4af79ccf22321eaf0e6f29a088a1b21f9f5673e875 WatchSource:0}: Error finding container ee47ec825008436bf3ebee4af79ccf22321eaf0e6f29a088a1b21f9f5673e875: Status 404 returned error can't find the container with id ee47ec825008436bf3ebee4af79ccf22321eaf0e6f29a088a1b21f9f5673e875 Apr 16 10:23:40.507356 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.507343 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:23:40.719114 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:23:40.719080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" event={"ID":"59c14b48-1bc0-49e1-a635-207379490871","Type":"ContainerStarted","Data":"ee47ec825008436bf3ebee4af79ccf22321eaf0e6f29a088a1b21f9f5673e875"} Apr 16 10:30:13.225984 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:13.225944 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" event={"ID":"59c14b48-1bc0-49e1-a635-207379490871","Type":"ContainerStarted","Data":"992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5"} Apr 16 10:30:13.228412 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:13.228394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-qqzv7\"/\"default-dockercfg-2k5h2\"" Apr 16 10:30:13.248811 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:13.248767 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" podStartSLOduration=1.136731144 podStartE2EDuration="6m33.248752401s" podCreationTimestamp="2026-04-16 10:23:40 +0000 UTC" firstStartedPulling="2026-04-16 10:23:40.507469633 +0000 UTC m=+1095.344072781" lastFinishedPulling="2026-04-16 10:30:12.619490886 +0000 UTC m=+1487.456094038" observedRunningTime="2026-04-16 10:30:13.247859088 +0000 UTC m=+1488.084462262" watchObservedRunningTime="2026-04-16 10:30:13.248752401 +0000 UTC m=+1488.085355571" Apr 16 10:30:13.291013 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:13.290981 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-qqzv7\"/\"kube-root-ca.crt\"" Apr 16 10:30:13.300872 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:13.300846 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-qqzv7\"/\"openshift-service-ca.crt\"" Apr 16 10:30:16.237380 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:16.237345 2575 generic.go:358] "Generic (PLEG): container finished" podID="59c14b48-1bc0-49e1-a635-207379490871" containerID="992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5" exitCode=0 Apr 16 10:30:16.237778 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:16.237396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" event={"ID":"59c14b48-1bc0-49e1-a635-207379490871","Type":"ContainerDied","Data":"992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5"} Apr 16 10:30:17.384088 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:17.384064 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 10:30:17.531340 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:17.531256 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv99k\" (UniqueName: \"kubernetes.io/projected/59c14b48-1bc0-49e1-a635-207379490871-kube-api-access-gv99k\") pod \"59c14b48-1bc0-49e1-a635-207379490871\" (UID: \"59c14b48-1bc0-49e1-a635-207379490871\") " Apr 16 10:30:17.533484 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:17.533463 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c14b48-1bc0-49e1-a635-207379490871-kube-api-access-gv99k" (OuterVolumeSpecName: "kube-api-access-gv99k") pod "59c14b48-1bc0-49e1-a635-207379490871" (UID: "59c14b48-1bc0-49e1-a635-207379490871"). InnerVolumeSpecName "kube-api-access-gv99k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:30:17.632460 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:17.632423 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gv99k\" (UniqueName: \"kubernetes.io/projected/59c14b48-1bc0-49e1-a635-207379490871-kube-api-access-gv99k\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:30:18.244941 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.244912 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" Apr 16 10:30:18.245111 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.244941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x" event={"ID":"59c14b48-1bc0-49e1-a635-207379490871","Type":"ContainerDied","Data":"ee47ec825008436bf3ebee4af79ccf22321eaf0e6f29a088a1b21f9f5673e875"} Apr 16 10:30:18.245111 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.244973 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee47ec825008436bf3ebee4af79ccf22321eaf0e6f29a088a1b21f9f5673e875" Apr 16 10:30:18.641374 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.641335 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw"] Apr 16 10:30:18.641767 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.641702 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59c14b48-1bc0-49e1-a635-207379490871" containerName="node" Apr 16 10:30:18.641767 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.641714 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c14b48-1bc0-49e1-a635-207379490871" containerName="node" Apr 16 10:30:18.641845 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.641767 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="59c14b48-1bc0-49e1-a635-207379490871" containerName="node" Apr 16 10:30:18.661568 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.661533 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw"] Apr 16 10:30:18.661722 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.661696 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:30:18.663978 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.663931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2mwd8\"/\"openshift-service-ca.crt\"" Apr 16 10:30:18.664115 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.663993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2mwd8\"/\"kube-root-ca.crt\"" Apr 16 10:30:18.664990 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.664965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-2mwd8\"/\"default-dockercfg-vkqz6\"" Apr 16 10:30:18.742021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.741978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqksg\" (UniqueName: \"kubernetes.io/projected/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda-kube-api-access-xqksg\") pod \"test-trainjob-wh52x-node-0-0-585nw\" (UID: \"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda\") " pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:30:18.843139 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.843099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqksg\" (UniqueName: \"kubernetes.io/projected/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda-kube-api-access-xqksg\") pod \"test-trainjob-wh52x-node-0-0-585nw\" (UID: \"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda\") " pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:30:18.851011 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.850980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqksg\" (UniqueName: \"kubernetes.io/projected/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda-kube-api-access-xqksg\") pod \"test-trainjob-wh52x-node-0-0-585nw\" (UID: \"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda\") " 
pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:30:18.972922 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:18.972826 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:30:19.135275 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:19.135247 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw"] Apr 16 10:30:19.137432 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:30:19.137397 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e4c7b9_f470_4c0f_b5f7_c24e6cfeceda.slice/crio-51973e051cc4ecf690f2a785c3fbf091bef4bd64f6175b2837d5776c420ff916 WatchSource:0}: Error finding container 51973e051cc4ecf690f2a785c3fbf091bef4bd64f6175b2837d5776c420ff916: Status 404 returned error can't find the container with id 51973e051cc4ecf690f2a785c3fbf091bef4bd64f6175b2837d5776c420ff916 Apr 16 10:30:19.139217 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:19.139198 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:30:19.249362 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:30:19.249279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" event={"ID":"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda","Type":"ContainerStarted","Data":"51973e051cc4ecf690f2a785c3fbf091bef4bd64f6175b2837d5776c420ff916"} Apr 16 10:36:51.770616 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:36:51.770524 2575 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage" Apr 16 10:36:51.770616 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:36:51.770598 2575 container_gc.go:86] "Attempting to delete unused containers" Apr 16 10:36:51.772046 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:36:51.772020 2575 scope.go:117] 
"RemoveContainer" containerID="39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6" Apr 16 10:37:01.674534 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:37:01.674502 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasDiskPressure" Apr 16 10:38:11.743353 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:38:11.743262 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 16 10:38:11.743353 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:38:11.743312 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 16 10:38:11.743353 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:38:11.743323 2575 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 16 10:38:51.773350 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:38:51.773298 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6" Apr 16 10:38:51.773350 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:38:51.773356 2575 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6" Apr 16 10:38:51.773906 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:38:51.773373 2575 scope.go:117] "RemoveContainer" containerID="e78337b455ed8a78d53cce84a2eb8106ffaaa155807b687513861351422f54bf" Apr 16 10:39:08.794023 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:39:08.793973 2575 log.go:32] "ListImages with filter from image service failed" 
err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 16 10:39:08.794023 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:39:08.794026 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 16 10:39:08.794535 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:39:08.794038 2575 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 16 10:39:08.798278 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:39:08.798247 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 16 10:39:08.798369 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:39:08.798288 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 16 10:39:08.798369 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:39:08.798306 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 16 10:40:41.743800 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:41.743750 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 16 10:40:41.743800 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:41.743805 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 16 10:40:41.747847 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:41.743819 2575 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 16 10:40:51.775004 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:51.774957 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="e78337b455ed8a78d53cce84a2eb8106ffaaa155807b687513861351422f54bf" Apr 16 10:40:51.775004 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:51.775007 2575 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="e78337b455ed8a78d53cce84a2eb8106ffaaa155807b687513861351422f54bf" Apr 16 10:40:51.775481 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:51.775029 2575 scope.go:117] "RemoveContainer" containerID="283d84230342e5f9ae2d24bf600d995cdb13fc95b6ebf37f1ed133080b83ed95" Apr 16 10:40:53.556689 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.556518 2575 scope.go:117] "RemoveContainer" containerID="43673af0eccbc5f645573b8894ace0429eed60069bd2fa194e1cb38fb1e58543" Apr 16 10:40:53.614522 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.614492 2575 scope.go:117] "RemoveContainer" 
containerID="6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5" Apr 16 10:40:53.631136 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.631114 2575 scope.go:117] "RemoveContainer" containerID="b285297cd7e442d1abd200f57e67a5f958e1ef20a740358ef4b2b3291e84ffa1" Apr 16 10:40:53.639620 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.639597 2575 scope.go:117] "RemoveContainer" containerID="f203beb66d97ad620a4f640b6a7f4a09a8d7e89a799b535564736bf1f32c65a6" Apr 16 10:40:53.647274 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.647249 2575 scope.go:117] "RemoveContainer" containerID="992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5" Apr 16 10:40:53.655637 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.655615 2575 scope.go:117] "RemoveContainer" containerID="05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183" Apr 16 10:40:53.678559 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.678535 2575 scope.go:117] "RemoveContainer" containerID="a8087414de649b8b4d7a22a38309f624d685375d6ccc69dfc2716a46f4a454f8" Apr 16 10:40:53.712548 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.712529 2575 image_gc_manager.go:447] "Attempting to delete unused images" Apr 16 10:40:53.735564 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.735536 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51" size=1065432607 runtimeHandler="" Apr 16 10:40:53.819609 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:53.819577 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="ddcfeee20e1ff551b445c3628005b77e942ea4ac3d8392e643c50a8a475c3949" size=1064972629 runtimeHandler="" Apr 16 10:40:54.294387 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:54.294329 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing 
blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_ExperimentalGrid_Contraction_l_Ailk_Bljk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6" Apr 16 10:40:54.294598 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:54.294555 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-wh52x-node-0-0.test-trainjob-wh52x,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]Volu
meMount{VolumeMount{Name:kube-api-access-xqksg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-wh52x-node-0-0-585nw_test-ns-2mwd8(44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_ExperimentalGrid_Contraction_l_Ailk_Bljk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError" Apr 16 10:40:54.295881 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:54.295845 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob 
\\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_ExperimentalGrid_Contraction_l_Ailk_Bljk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" Apr 16 10:40:54.359937 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.359850 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="c0c3ed5dc84741877d461ff7dbe92002f04b484245c9f619a2a8efbd91e31c87" size=884093158 runtimeHandler="" Apr 16 10:40:54.384239 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.384205 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="a3bcd229da71abee5d72332f1a8c7b329bc17fbc4a435273284d7245e4dc3aa5" size=108352841 runtimeHandler="" Apr 16 10:40:54.396792 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.396763 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="0f7444f43d8ac64689b66c4ca32ac59671a4f9c4a5cc71a76867d7504761958b" size=977380827 runtimeHandler="" Apr 16 10:40:54.422241 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.422211 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="ba0d5ab4eb24f99d84ae4923fefa85e3ab5042c1e554dcca3a41789529499633" size=107183730 runtimeHandler="" Apr 16 10:40:54.432916 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.432887 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler="" Apr 16 10:40:54.635942 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.635859 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-2mwd8\"/\"default-dockercfg-vkqz6\"" Apr 
16 10:40:54.747397 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.747366 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2mwd8\"/\"kube-root-ca.crt\"" Apr 16 10:40:54.757261 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:54.757235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2mwd8\"/\"openshift-service-ca.crt\"" Apr 16 10:40:58.367121 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:58.366662 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="65e84aed5e78009bfc4af6cd682cca48f00a6e4317ab8d2b37f037ee1b735dc8" size=23199586225 runtimeHandler="" Apr 16 10:40:58.367121 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:40:58.367047 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:40:58.367493 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:40:58.367337 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_ExperimentalGrid_Contraction_l_Ailk_Bljk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" Apr 16 10:41:02.420210 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:41:02.420167 2575 
image_gc_manager.go:514] "Removing image to free bytes" imageID="1f369340181d991e38175950af0cc351e6c2eaddca22075fb9259c214ef5b9c9" size=7588072888 runtimeHandler="" Apr 16 10:41:05.807199 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:41:05.807145 2575 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler="" Apr 16 10:41:08.804018 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:41:08.803942 2575 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." resourceName="ephemeral-storage" Apr 16 10:42:08.022419 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:42:08.022386 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-215.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:44:08.821127 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:44:08.821057 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 10:48:11.778418 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:48:11.778371 2575 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 16 10:48:11.778418 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:48:11.778422 2575 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 16 10:48:11.868924 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:48:11.778432 2575 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 16 10:50:18.628897 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:50:18.628814 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5\": container with ID starting with 
992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5 not found: ID does not exist" containerID="992577d6328cf4aa5925021db1a92843f3e6a8739b2df3edcf5b4be08018c3b5" Apr 16 10:50:18.729765 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:50:18.729720 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5\": container with ID starting with 6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5 not found: ID does not exist" containerID="6144308069ea7085ae7b1cb7578d57fdd8b8904cbbce10d06983f097f6e068b5" Apr 16 10:50:18.829767 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:50:18.829729 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6\": container with ID starting with 39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6 not found: ID does not exist" containerID="39283faca35313554f2adbf2e376de673ad3a07d3a791eabb463a25aed36e4f6" Apr 16 10:50:19.325965 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:50:19.325930 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183\": container with ID starting with 05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183 not found: ID does not exist" containerID="05c6bd008d8fe8ab0b056d5cbb1981849fb0420cf73766543d87a235b74f7183" Apr 16 10:50:23.586912 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.586875 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw"] Apr 16 10:50:23.683443 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.683401 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x"] Apr 16 10:50:23.686848 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.686820 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-qqzv7/test-trainjob-4sxmt-node-0-0-rrv8x"] Apr 16 10:50:23.733136 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.733093 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c14b48-1bc0-49e1-a635-207379490871" path="/var/lib/kubelet/pods/59c14b48-1bc0-49e1-a635-207379490871/volumes" Apr 16 10:50:23.763946 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.763905 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2"] Apr 16 10:50:23.771430 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.771396 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-87dcb/test-trainjob-wtdjj-node-0-0-772z2"] Apr 16 10:50:23.962934 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.962846 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp"] Apr 16 10:50:23.969776 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:23.969743 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-hgngd/test-trainjob-kqfd8-node-0-0-bpmlp"] Apr 16 10:50:24.579851 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:24.579806 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx"] Apr 16 10:50:24.586966 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:24.586933 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-zrbm8/test-trainjob-r845r-node-0-0-lj7dx"] Apr 16 10:50:25.735106 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:25.735068 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19" path="/var/lib/kubelet/pods/0a28c57a-7a03-41c7-b94d-2a1cd8b0ba19/volumes" Apr 16 10:50:25.735579 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:50:25.735556 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235732e9-5820-4c65-99c0-cec3885fa876" path="/var/lib/kubelet/pods/235732e9-5820-4c65-99c0-cec3885fa876/volumes" Apr 16 10:50:25.735951 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:25.735930 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdba224-6fa0-42bb-b146-dc9c5448d33d" path="/var/lib/kubelet/pods/4bdba224-6fa0-42bb-b146-dc9c5448d33d/volumes" Apr 16 10:50:29.674219 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:29.674171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" event={"ID":"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda","Type":"ContainerStarted","Data":"1bbb171d01c74e783acef63e3905a14180d6d793c47464ca43d948ad2217bf32"} Apr 16 10:50:29.674219 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:29.674200 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" containerName="node" containerID="cri-o://1bbb171d01c74e783acef63e3905a14180d6d793c47464ca43d948ad2217bf32" gracePeriod=30 Apr 16 10:50:29.695725 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:29.695666 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" podStartSLOduration=1.705638101 podStartE2EDuration="20m11.695651991s" podCreationTimestamp="2026-04-16 10:30:18 +0000 UTC" firstStartedPulling="2026-04-16 10:30:19.139325612 +0000 UTC m=+1493.975928761" lastFinishedPulling="2026-04-16 10:50:29.1293395 +0000 UTC m=+2703.965942651" observedRunningTime="2026-04-16 10:50:29.694210915 +0000 UTC m=+2704.530814087" watchObservedRunningTime="2026-04-16 10:50:29.695651991 +0000 UTC m=+2704.532255161" Apr 16 10:50:45.726465 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.726432 2575 generic.go:358] "Generic 
(PLEG): container finished" podID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" containerID="1bbb171d01c74e783acef63e3905a14180d6d793c47464ca43d948ad2217bf32" exitCode=0 Apr 16 10:50:45.726874 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.726503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" event={"ID":"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda","Type":"ContainerDied","Data":"1bbb171d01c74e783acef63e3905a14180d6d793c47464ca43d948ad2217bf32"} Apr 16 10:50:45.726874 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.726542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" event={"ID":"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda","Type":"ContainerDied","Data":"51973e051cc4ecf690f2a785c3fbf091bef4bd64f6175b2837d5776c420ff916"} Apr 16 10:50:45.726874 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.726554 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51973e051cc4ecf690f2a785c3fbf091bef4bd64f6175b2837d5776c420ff916" Apr 16 10:50:45.730554 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.730534 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:50:45.778536 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.778454 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqksg\" (UniqueName: \"kubernetes.io/projected/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda-kube-api-access-xqksg\") pod \"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda\" (UID: \"44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda\") " Apr 16 10:50:45.780743 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.780707 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda-kube-api-access-xqksg" (OuterVolumeSpecName: "kube-api-access-xqksg") pod "44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" (UID: "44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda"). InnerVolumeSpecName "kube-api-access-xqksg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:50:45.878997 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:45.878957 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqksg\" (UniqueName: \"kubernetes.io/projected/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda-kube-api-access-xqksg\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:50:46.730577 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:46.730540 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw" Apr 16 10:50:46.752829 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:46.752800 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw"] Apr 16 10:50:46.757192 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:46.757143 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-2mwd8/test-trainjob-wh52x-node-0-0-585nw"] Apr 16 10:50:47.733292 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:50:47.733255 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" path="/var/lib/kubelet/pods/44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda/volumes" Apr 16 10:52:24.428070 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.428032 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5"] Apr 16 10:52:24.428548 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.428402 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" containerName="node" Apr 16 10:52:24.428548 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.428414 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" containerName="node" Apr 16 10:52:24.428548 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.428475 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="44e4c7b9-f470-4c0f-b5f7-c24e6cfeceda" containerName="node" Apr 16 10:52:24.431366 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.431351 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:24.433768 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.433749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-28lhx\"/\"kube-root-ca.crt\"" Apr 16 10:52:24.434008 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.433988 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-28lhx\"/\"openshift-service-ca.crt\"" Apr 16 10:52:24.434779 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.434762 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-28lhx\"/\"default-dockercfg-wbmph\"" Apr 16 10:52:24.438980 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.438954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5"] Apr 16 10:52:24.538919 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.538874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbpk5\" (UniqueName: \"kubernetes.io/projected/4f09ee24-f2d9-4312-9106-875f4ab6e710-kube-api-access-vbpk5\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:24.640415 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.640374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbpk5\" (UniqueName: \"kubernetes.io/projected/4f09ee24-f2d9-4312-9106-875f4ab6e710-kube-api-access-vbpk5\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:24.648681 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:24.648644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbpk5\" (UniqueName: 
\"kubernetes.io/projected/4f09ee24-f2d9-4312-9106-875f4ab6e710-kube-api-access-vbpk5\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.214690 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.214641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.220734 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.220706 2575 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") device path: \"\"" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.315072 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.315019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.315288 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.315184 2575 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") DevicePath \"\"" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" 
Apr 16 10:52:33.318759 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.318737 2575 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") DevicePath \"csi-8ccb3d5c8d7218586eec0f9fb4e2f636fdc145976bfe93616b9b10516756c9ca\"" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.468169 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.468084 2575 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/8937ea088fea58cb063953fb1ba1725a076f210879b1866822286697fdc5e4b1/globalmount\"" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.484308 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.484278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-node-0-0-c9kd5\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.745131 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.745042 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-28lhx\"/\"default-dockercfg-wbmph\"" Apr 16 10:52:33.753080 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.753060 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:52:33.877748 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.877719 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5"] Apr 16 10:52:33.879763 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:52:33.879737 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f09ee24_f2d9_4312_9106_875f4ab6e710.slice/crio-24680ef5eff7d17d32402d914065623ea06723fbba16c0eb0c56fa2b8973e5f9 WatchSource:0}: Error finding container 24680ef5eff7d17d32402d914065623ea06723fbba16c0eb0c56fa2b8973e5f9: Status 404 returned error can't find the container with id 24680ef5eff7d17d32402d914065623ea06723fbba16c0eb0c56fa2b8973e5f9 Apr 16 10:52:33.881670 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:33.881654 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:52:34.094459 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:52:34.094422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" event={"ID":"4f09ee24-f2d9-4312-9106-875f4ab6e710","Type":"ContainerStarted","Data":"24680ef5eff7d17d32402d914065623ea06723fbba16c0eb0c56fa2b8973e5f9"} Apr 16 10:53:45.393030 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:45.392992 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f09ee24-f2d9-4312-9106-875f4ab6e710" containerID="837568ceaa952b40c92791f4f09604f42eafc250557d574c6c91f1bbc2fe228b" exitCode=0 Apr 16 10:53:45.393534 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:45.393082 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" event={"ID":"4f09ee24-f2d9-4312-9106-875f4ab6e710","Type":"ContainerDied","Data":"837568ceaa952b40c92791f4f09604f42eafc250557d574c6c91f1bbc2fe228b"} Apr 16 10:53:46.531553 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.531525 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:53:46.614304 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.614266 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbpk5\" (UniqueName: \"kubernetes.io/projected/4f09ee24-f2d9-4312-9106-875f4ab6e710-kube-api-access-vbpk5\") pod \"4f09ee24-f2d9-4312-9106-875f4ab6e710\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " Apr 16 10:53:46.614458 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.614447 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"4f09ee24-f2d9-4312-9106-875f4ab6e710\" (UID: \"4f09ee24-f2d9-4312-9106-875f4ab6e710\") " Apr 16 10:53:46.616483 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.616452 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f09ee24-f2d9-4312-9106-875f4ab6e710-kube-api-access-vbpk5" (OuterVolumeSpecName: "kube-api-access-vbpk5") pod "4f09ee24-f2d9-4312-9106-875f4ab6e710" (UID: "4f09ee24-f2d9-4312-9106-875f4ab6e710"). InnerVolumeSpecName "kube-api-access-vbpk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:53:46.617104 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.617081 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60" (OuterVolumeSpecName: "workspace") pod "4f09ee24-f2d9-4312-9106-875f4ab6e710" (UID: "4f09ee24-f2d9-4312-9106-875f4ab6e710"). InnerVolumeSpecName "pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306". 
PluginName "kubernetes.io/csi", VolumeGIDValue "" Apr 16 10:53:46.715607 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.715527 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vbpk5\" (UniqueName: \"kubernetes.io/projected/4f09ee24-f2d9-4312-9106-875f4ab6e710-kube-api-access-vbpk5\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:53:46.715607 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.715579 2575 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") on node \"ip-10-0-135-215.ec2.internal\" " Apr 16 10:53:46.735777 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.735748 2575 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60") on node "ip-10-0-135-215.ec2.internal" Apr 16 10:53:46.816612 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:46.816574 2575 reconciler_common.go:299] "Volume detached for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"csi-8ccb3d5c8d7218586eec0f9fb4e2f636fdc145976bfe93616b9b10516756c9ca\"" Apr 16 10:53:47.401341 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:47.401301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" event={"ID":"4f09ee24-f2d9-4312-9106-875f4ab6e710","Type":"ContainerDied","Data":"24680ef5eff7d17d32402d914065623ea06723fbba16c0eb0c56fa2b8973e5f9"} Apr 16 10:53:47.401341 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:47.401335 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24680ef5eff7d17d32402d914065623ea06723fbba16c0eb0c56fa2b8973e5f9" Apr 16 10:53:47.401596 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:47.401349 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5" Apr 16 10:53:47.777619 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:47.777545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-28lhx_test-trainjob-4xc4f-node-0-0-c9kd5_4f09ee24-f2d9-4312-9106-875f4ab6e710/node/0.log" Apr 16 10:53:52.828750 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:52.828711 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5"] Apr 16 10:53:52.834827 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:52.834800 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-node-0-0-c9kd5"] Apr 16 10:53:53.734872 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:53:53.734833 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f09ee24-f2d9-4312-9106-875f4ab6e710" path="/var/lib/kubelet/pods/4f09ee24-f2d9-4312-9106-875f4ab6e710/volumes" Apr 16 10:54:01.485937 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.485897 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b48wm/must-gather-2qdtg"] Apr 16 10:54:01.486375 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.486316 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f09ee24-f2d9-4312-9106-875f4ab6e710" containerName="node" Apr 16 10:54:01.486375 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.486330 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f09ee24-f2d9-4312-9106-875f4ab6e710" containerName="node" Apr 16 10:54:01.486454 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.486387 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f09ee24-f2d9-4312-9106-875f4ab6e710" containerName="node" Apr 16 10:54:01.489320 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:54:01.489301 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.491919 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.491896 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b48wm\"/\"kube-root-ca.crt\"" Apr 16 10:54:01.492924 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.492906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b48wm\"/\"default-dockercfg-vcjxv\"" Apr 16 10:54:01.493002 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.492906 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b48wm\"/\"openshift-service-ca.crt\"" Apr 16 10:54:01.497205 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.497181 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b48wm/must-gather-2qdtg"] Apr 16 10:54:01.646420 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.646377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9cba92a-4b43-4282-af33-2c578c734436-must-gather-output\") pod \"must-gather-2qdtg\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.646584 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.646437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n949f\" (UniqueName: \"kubernetes.io/projected/f9cba92a-4b43-4282-af33-2c578c734436-kube-api-access-n949f\") pod \"must-gather-2qdtg\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.748021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.747940 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9cba92a-4b43-4282-af33-2c578c734436-must-gather-output\") pod \"must-gather-2qdtg\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.748021 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.747981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n949f\" (UniqueName: \"kubernetes.io/projected/f9cba92a-4b43-4282-af33-2c578c734436-kube-api-access-n949f\") pod \"must-gather-2qdtg\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.748319 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.748298 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9cba92a-4b43-4282-af33-2c578c734436-must-gather-output\") pod \"must-gather-2qdtg\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.756480 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.756455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n949f\" (UniqueName: \"kubernetes.io/projected/f9cba92a-4b43-4282-af33-2c578c734436-kube-api-access-n949f\") pod \"must-gather-2qdtg\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.799639 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.799605 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:54:01.933111 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:01.933086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b48wm/must-gather-2qdtg"] Apr 16 10:54:01.935667 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:54:01.935632 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9cba92a_4b43_4282_af33_2c578c734436.slice/crio-19ca54d668a996763af46298341f1e496e0655271fbab364561de6821ceabe4f WatchSource:0}: Error finding container 19ca54d668a996763af46298341f1e496e0655271fbab364561de6821ceabe4f: Status 404 returned error can't find the container with id 19ca54d668a996763af46298341f1e496e0655271fbab364561de6821ceabe4f Apr 16 10:54:02.452107 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:02.452071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b48wm/must-gather-2qdtg" event={"ID":"f9cba92a-4b43-4282-af33-2c578c734436","Type":"ContainerStarted","Data":"19ca54d668a996763af46298341f1e496e0655271fbab364561de6821ceabe4f"} Apr 16 10:54:07.472802 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:07.472772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b48wm/must-gather-2qdtg" event={"ID":"f9cba92a-4b43-4282-af33-2c578c734436","Type":"ContainerStarted","Data":"8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b"} Apr 16 10:54:08.478496 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:08.478444 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b48wm/must-gather-2qdtg" event={"ID":"f9cba92a-4b43-4282-af33-2c578c734436","Type":"ContainerStarted","Data":"9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc"} Apr 16 10:54:08.495857 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:08.495796 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-b48wm/must-gather-2qdtg" podStartSLOduration=2.1317216820000002 podStartE2EDuration="7.495779595s" podCreationTimestamp="2026-04-16 10:54:01 +0000 UTC" firstStartedPulling="2026-04-16 10:54:01.937474266 +0000 UTC m=+2916.774077415" lastFinishedPulling="2026-04-16 10:54:07.301532177 +0000 UTC m=+2922.138135328" observedRunningTime="2026-04-16 10:54:08.495485295 +0000 UTC m=+2923.332088466" watchObservedRunningTime="2026-04-16 10:54:08.495779595 +0000 UTC m=+2923.332382767" Apr 16 10:54:58.669245 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:58.669207 2575 generic.go:358] "Generic (PLEG): container finished" podID="f9cba92a-4b43-4282-af33-2c578c734436" containerID="8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b" exitCode=0 Apr 16 10:54:58.669656 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:58.669286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b48wm/must-gather-2qdtg" event={"ID":"f9cba92a-4b43-4282-af33-2c578c734436","Type":"ContainerDied","Data":"8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b"} Apr 16 10:54:58.669656 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:58.669625 2575 scope.go:117] "RemoveContainer" containerID="8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b" Apr 16 10:54:58.712035 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:54:58.712004 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b48wm_must-gather-2qdtg_f9cba92a-4b43-4282-af33-2c578c734436/gather/0.log" Apr 16 10:55:01.865453 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:01.865428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cgvfp_db3c6315-5043-4081-9649-b6ca08d1fb33/global-pull-secret-syncer/0.log" Apr 16 10:55:02.009578 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:02.009543 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-cjccl_5bf46483-1c9c-44d9-8737-763a361a473f/konnectivity-agent/0.log" Apr 16 10:55:02.163086 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:02.163003 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-215.ec2.internal_42c5bf6a0ae41a4c6f004afe4db8cd52/haproxy/0.log" Apr 16 10:55:04.077822 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.077783 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b48wm/must-gather-2qdtg"] Apr 16 10:55:04.078332 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.077994 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-b48wm/must-gather-2qdtg" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="copy" containerID="cri-o://9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc" gracePeriod=2 Apr 16 10:55:04.084201 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.084175 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b48wm/must-gather-2qdtg"] Apr 16 10:55:04.312852 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.312827 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b48wm_must-gather-2qdtg_f9cba92a-4b43-4282-af33-2c578c734436/copy/0.log" Apr 16 10:55:04.313194 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.313178 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:55:04.315380 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.315355 2575 status_manager.go:895] "Failed to get status for pod" podUID="f9cba92a-4b43-4282-af33-2c578c734436" pod="openshift-must-gather-b48wm/must-gather-2qdtg" err="pods \"must-gather-2qdtg\" is forbidden: User \"system:node:ip-10-0-135-215.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b48wm\": no relationship found between node 'ip-10-0-135-215.ec2.internal' and this object" Apr 16 10:55:04.420960 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.420867 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9cba92a-4b43-4282-af33-2c578c734436-must-gather-output\") pod \"f9cba92a-4b43-4282-af33-2c578c734436\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " Apr 16 10:55:04.420960 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.420944 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n949f\" (UniqueName: \"kubernetes.io/projected/f9cba92a-4b43-4282-af33-2c578c734436-kube-api-access-n949f\") pod \"f9cba92a-4b43-4282-af33-2c578c734436\" (UID: \"f9cba92a-4b43-4282-af33-2c578c734436\") " Apr 16 10:55:04.423138 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.423103 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cba92a-4b43-4282-af33-2c578c734436-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f9cba92a-4b43-4282-af33-2c578c734436" (UID: "f9cba92a-4b43-4282-af33-2c578c734436"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:55:04.423275 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.423144 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cba92a-4b43-4282-af33-2c578c734436-kube-api-access-n949f" (OuterVolumeSpecName: "kube-api-access-n949f") pod "f9cba92a-4b43-4282-af33-2c578c734436" (UID: "f9cba92a-4b43-4282-af33-2c578c734436"). InnerVolumeSpecName "kube-api-access-n949f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:55:04.522651 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.522611 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9cba92a-4b43-4282-af33-2c578c734436-must-gather-output\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:55:04.522651 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.522647 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n949f\" (UniqueName: \"kubernetes.io/projected/f9cba92a-4b43-4282-af33-2c578c734436-kube-api-access-n949f\") on node \"ip-10-0-135-215.ec2.internal\" DevicePath \"\"" Apr 16 10:55:04.690259 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.690176 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b48wm_must-gather-2qdtg_f9cba92a-4b43-4282-af33-2c578c734436/copy/0.log" Apr 16 10:55:04.690523 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.690500 2575 generic.go:358] "Generic (PLEG): container finished" podID="f9cba92a-4b43-4282-af33-2c578c734436" containerID="9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc" exitCode=143 Apr 16 10:55:04.690594 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.690552 2575 scope.go:117] "RemoveContainer" containerID="9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc" Apr 16 10:55:04.690594 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.690556 2575 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b48wm/must-gather-2qdtg" Apr 16 10:55:04.693211 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.693182 2575 status_manager.go:895] "Failed to get status for pod" podUID="f9cba92a-4b43-4282-af33-2c578c734436" pod="openshift-must-gather-b48wm/must-gather-2qdtg" err="pods \"must-gather-2qdtg\" is forbidden: User \"system:node:ip-10-0-135-215.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b48wm\": no relationship found between node 'ip-10-0-135-215.ec2.internal' and this object" Apr 16 10:55:04.698933 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.698873 2575 scope.go:117] "RemoveContainer" containerID="8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b" Apr 16 10:55:04.701416 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.701380 2575 status_manager.go:895] "Failed to get status for pod" podUID="f9cba92a-4b43-4282-af33-2c578c734436" pod="openshift-must-gather-b48wm/must-gather-2qdtg" err="pods \"must-gather-2qdtg\" is forbidden: User \"system:node:ip-10-0-135-215.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b48wm\": no relationship found between node 'ip-10-0-135-215.ec2.internal' and this object" Apr 16 10:55:04.710379 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.710360 2575 scope.go:117] "RemoveContainer" containerID="9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc" Apr 16 10:55:04.710631 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:55:04.710605 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc\": container with ID starting with 9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc not found: ID does not exist" 
containerID="9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc" Apr 16 10:55:04.710674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.710633 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc"} err="failed to get container status \"9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc\": rpc error: code = NotFound desc = could not find container \"9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc\": container with ID starting with 9066baaa9a7c6ee44de6b339519987084045cd95714fd2662c8ec819134634bc not found: ID does not exist" Apr 16 10:55:04.710674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.710655 2575 scope.go:117] "RemoveContainer" containerID="8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b" Apr 16 10:55:04.710879 ip-10-0-135-215 kubenswrapper[2575]: E0416 10:55:04.710865 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b\": container with ID starting with 8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b not found: ID does not exist" containerID="8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b" Apr 16 10:55:04.710915 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:04.710884 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b"} err="failed to get container status \"8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b\": rpc error: code = NotFound desc = could not find container \"8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b\": container with ID starting with 8b634ca38372b9d30a58669c4be573b4d1cacb638a69a6a240d55d6fe14c961b not found: ID does not exist" Apr 16 
10:55:05.192039 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.191997 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/alertmanager/0.log" Apr 16 10:55:05.268936 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.268909 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/config-reloader/0.log" Apr 16 10:55:05.325467 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.325423 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/kube-rbac-proxy-web/0.log" Apr 16 10:55:05.366985 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.366956 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/kube-rbac-proxy/0.log" Apr 16 10:55:05.416313 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.416285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/kube-rbac-proxy-metric/0.log" Apr 16 10:55:05.444927 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.444899 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/prom-label-proxy/0.log" Apr 16 10:55:05.503951 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.503922 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fa2b73b5-5c87-4838-92f1-ee572b2bdc8f/init-config-reloader/0.log" Apr 16 10:55:05.577004 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.576923 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-pgz48_a6b84455-6de2-4e18-9ce9-064521f88942/kube-state-metrics/0.log" Apr 16 10:55:05.618221 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.618185 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-pgz48_a6b84455-6de2-4e18-9ce9-064521f88942/kube-rbac-proxy-main/0.log" Apr 16 10:55:05.674296 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.674269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-pgz48_a6b84455-6de2-4e18-9ce9-064521f88942/kube-rbac-proxy-self/0.log" Apr 16 10:55:05.733718 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.733686 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9cba92a-4b43-4282-af33-2c578c734436" path="/var/lib/kubelet/pods/f9cba92a-4b43-4282-af33-2c578c734436/volumes" Apr 16 10:55:05.735270 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.735244 2575 status_manager.go:895] "Failed to get status for pod" podUID="f9cba92a-4b43-4282-af33-2c578c734436" pod="openshift-must-gather-b48wm/must-gather-2qdtg" err="pods \"must-gather-2qdtg\" is forbidden: User \"system:node:ip-10-0-135-215.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b48wm\": no relationship found between node 'ip-10-0-135-215.ec2.internal' and this object" Apr 16 10:55:05.790149 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.790119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-gd6c7_5d767b9e-377e-4435-b350-6d44a9d2b985/monitoring-plugin/0.log" Apr 16 10:55:05.960444 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.960365 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9lz6x_24678d87-1b27-4719-b309-5f2c3c4150d3/node-exporter/0.log" Apr 16 10:55:05.999057 
ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:05.999028 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9lz6x_24678d87-1b27-4719-b309-5f2c3c4150d3/kube-rbac-proxy/0.log" Apr 16 10:55:06.057674 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.057647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9lz6x_24678d87-1b27-4719-b309-5f2c3c4150d3/init-textfile/0.log" Apr 16 10:55:06.269866 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.269833 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z9rb5_1fd12477-bb9e-4a4a-9298-5e067ebfd148/kube-rbac-proxy-main/0.log" Apr 16 10:55:06.314043 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.314018 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z9rb5_1fd12477-bb9e-4a4a-9298-5e067ebfd148/kube-rbac-proxy-self/0.log" Apr 16 10:55:06.357526 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.357501 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z9rb5_1fd12477-bb9e-4a4a-9298-5e067ebfd148/openshift-state-metrics/0.log" Apr 16 10:55:06.886936 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.886904 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-s7rt6_c0790d3e-5135-4465-9ef5-f83578c6593f/prometheus-operator-admission-webhook/0.log" Apr 16 10:55:06.939510 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.939481 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f64b965c-tk4lz_2faa2477-0a90-47a6-b2fd-ef41b76224f7/telemeter-client/0.log" Apr 16 10:55:06.967798 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.967773 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-75f64b965c-tk4lz_2faa2477-0a90-47a6-b2fd-ef41b76224f7/reload/0.log" Apr 16 10:55:06.996594 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:06.996567 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f64b965c-tk4lz_2faa2477-0a90-47a6-b2fd-ef41b76224f7/kube-rbac-proxy/0.log" Apr 16 10:55:08.180640 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.180609 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp"] Apr 16 10:55:08.181013 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.180962 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="gather" Apr 16 10:55:08.181013 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.180973 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="gather" Apr 16 10:55:08.181013 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.180982 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="copy" Apr 16 10:55:08.181013 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.180988 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="copy" Apr 16 10:55:08.181147 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.181042 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="gather" Apr 16 10:55:08.181147 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.181049 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9cba92a-4b43-4282-af33-2c578c734436" containerName="copy" Apr 16 10:55:08.186264 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.186240 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.188877 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.188849 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d9gfw\"/\"openshift-service-ca.crt\"" Apr 16 10:55:08.188999 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.188921 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d9gfw\"/\"kube-root-ca.crt\"" Apr 16 10:55:08.189817 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.189801 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d9gfw\"/\"default-dockercfg-glr59\"" Apr 16 10:55:08.195280 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.195258 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp"] Apr 16 10:55:08.250274 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.250239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-podres\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.250274 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.250274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-proc\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.250488 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.250297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rx8br\" (UniqueName: \"kubernetes.io/projected/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-kube-api-access-rx8br\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.250488 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.250367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-lib-modules\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.250488 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.250453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-sys\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.351714 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-sys\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.351885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-podres\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " 
pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.351885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-proc\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.351885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8br\" (UniqueName: \"kubernetes.io/projected/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-kube-api-access-rx8br\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.351885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-sys\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.351885 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-lib-modules\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.352096 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-proc\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.352096 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-podres\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.352096 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.351942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-lib-modules\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.368182 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.368146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8br\" (UniqueName: \"kubernetes.io/projected/67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d-kube-api-access-rx8br\") pod \"perf-node-gather-daemonset-x65hp\" (UID: \"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.497805 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.497662 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.625895 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.625869 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp"] Apr 16 10:55:08.628597 ip-10-0-135-215 kubenswrapper[2575]: W0416 10:55:08.628565 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod67d40c2d_e7be_4fa1_80ec_cb5cdf350d5d.slice/crio-6263850ff98f5239e3f5bd1d3f7160f2de28420c121f5e0e2abb1c70a177741f WatchSource:0}: Error finding container 6263850ff98f5239e3f5bd1d3f7160f2de28420c121f5e0e2abb1c70a177741f: Status 404 returned error can't find the container with id 6263850ff98f5239e3f5bd1d3f7160f2de28420c121f5e0e2abb1c70a177741f Apr 16 10:55:08.704254 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.704217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" event={"ID":"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d","Type":"ContainerStarted","Data":"23423dfb0979603f71d48cb35c9507cc747cbfc0bc3b020a60f92391fca11649"} Apr 16 10:55:08.704384 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.704260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" event={"ID":"67d40c2d-e7be-4fa1-80ec-cb5cdf350d5d","Type":"ContainerStarted","Data":"6263850ff98f5239e3f5bd1d3f7160f2de28420c121f5e0e2abb1c70a177741f"} Apr 16 10:55:08.704384 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.704354 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:08.722311 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.722268 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" 
podStartSLOduration=0.72225363 podStartE2EDuration="722.25363ms" podCreationTimestamp="2026-04-16 10:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:55:08.720605781 +0000 UTC m=+2983.557208953" watchObservedRunningTime="2026-04-16 10:55:08.72225363 +0000 UTC m=+2983.558856801" Apr 16 10:55:08.872771 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:08.872738 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54c67d644c-vwmmw_9b184971-0296-4981-a2a4-1f92d9e4bac0/console/0.log" Apr 16 10:55:09.271682 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:09.271607 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-7b6rr_e13bd7e9-dbd9-408e-bd11-e92d36fb4d01/volume-data-source-validator/0.log" Apr 16 10:55:09.945797 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:09.945762 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjsr9_2b517951-6607-404b-bc0f-a66ec956499a/dns/0.log" Apr 16 10:55:09.965309 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:09.965278 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjsr9_2b517951-6607-404b-bc0f-a66ec956499a/kube-rbac-proxy/0.log" Apr 16 10:55:10.048968 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:10.048937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gjdtj_8a82420b-dfa7-4776-ada5-5f24f6e237d2/dns-node-resolver/0.log" Apr 16 10:55:10.531994 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:10.531962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-57d769fd65-pw5bg_fc7c6c13-2a7a-4b02-8c2e-c3648a528f96/registry/0.log" Apr 16 10:55:10.584715 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:10.584685 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rv2dg_6f990235-90f2-4344-9fbc-4a60dac858ad/node-ca/0.log" Apr 16 10:55:11.343616 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:11.343589 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8677bf5f94-ttzcm_029997ce-820e-46fb-9d03-11be41d65ce4/router/0.log" Apr 16 10:55:11.669376 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:11.669301 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2lw46_44072043-e38d-468a-ae02-9082c94f67cc/serve-healthcheck-canary/0.log" Apr 16 10:55:12.078634 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:12.078601 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-fcg2p_d7b584f5-67ae-4ab7-9eb5-4bd14248b512/insights-operator/1.log" Apr 16 10:55:12.112006 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:12.111975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-fcg2p_d7b584f5-67ae-4ab7-9eb5-4bd14248b512/insights-operator/0.log" Apr 16 10:55:12.201801 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:12.201770 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjxh6_a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd/kube-rbac-proxy/0.log" Apr 16 10:55:12.224983 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:12.224950 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjxh6_a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd/exporter/0.log" Apr 16 10:55:12.246816 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:12.246789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjxh6_a2d5854b-e8b4-4005-8d0d-bdcbabaf64fd/extractor/0.log" Apr 16 10:55:14.054126 ip-10-0-135-215 kubenswrapper[2575]: I0416 
10:55:14.054094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-2h7b8_2bc1d90f-b78e-498a-b4ce-74f985771419/jobset-operator/0.log" Apr 16 10:55:14.717760 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:14.717733 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-x65hp" Apr 16 10:55:18.662370 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.662344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/kube-multus-additional-cni-plugins/0.log" Apr 16 10:55:18.684348 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.684317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/egress-router-binary-copy/0.log" Apr 16 10:55:18.736695 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.736663 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/cni-plugins/0.log" Apr 16 10:55:18.757285 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.757256 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/bond-cni-plugin/0.log" Apr 16 10:55:18.780095 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.780067 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/routeoverride-cni/0.log" Apr 16 10:55:18.804352 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.804329 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/whereabouts-cni-bincopy/0.log" Apr 16 10:55:18.826840 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:18.826808 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kvzsv_98bb1e21-3148-43ae-9b64-5d00f0aadc0d/whereabouts-cni/0.log" Apr 16 10:55:19.043282 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:19.043207 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5h7c_ed2a524a-708b-4bef-af2f-4363358430af/kube-multus/0.log" Apr 16 10:55:19.145981 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:19.145950 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8r8s4_1591f776-8015-497b-a4bf-80b359c62427/network-metrics-daemon/0.log" Apr 16 10:55:19.167597 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:19.167537 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8r8s4_1591f776-8015-497b-a4bf-80b359c62427/kube-rbac-proxy/0.log" Apr 16 10:55:20.088633 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.088600 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/ovn-controller/0.log" Apr 16 10:55:20.127992 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.127960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/ovn-acl-logging/0.log" Apr 16 10:55:20.154013 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.153981 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/kube-rbac-proxy-node/0.log" Apr 16 10:55:20.177055 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.177022 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 10:55:20.199057 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.199031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/northd/0.log" Apr 16 10:55:20.229344 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.229317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/nbdb/0.log" Apr 16 10:55:20.258432 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.258405 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/sbdb/0.log" Apr 16 10:55:20.371451 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:20.371373 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7t7h_e21bf7bf-cce9-4deb-8977-30b0f4341386/ovnkube-controller/0.log" Apr 16 10:55:22.059797 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:22.059749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-5qr2r_a0d64e3c-30b9-4ca7-b47f-f9c59706692a/check-endpoints/0.log" Apr 16 10:55:22.137304 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:22.137274 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rvbk5_b8122b7e-3e94-4772-bea0-462846dfdfab/network-check-target-container/0.log" Apr 16 10:55:22.969185 ip-10-0-135-215 kubenswrapper[2575]: I0416 10:55:22.969142 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h8xhq_65a7bc47-f0bb-4b71-acd4-41c6d3d7ea1f/iptables-alerter/0.log" Apr 16 10:55:23.623608 ip-10-0-135-215 
kubenswrapper[2575]: I0416 10:55:23.623581 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xzl5j_99a65d6d-c433-41e5-a73d-38b4e9860935/tuned/0.log"